[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
13040 1726882402.02008: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Xyq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
13040 1726882402.02455: Added group all to inventory
13040 1726882402.02458: Added group ungrouped to inventory
13040 1726882402.02462: Group all now contains ungrouped
13040 1726882402.02467: Examining possible inventory source: /tmp/network-91m/inventory.yml
13040 1726882402.21784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
13040 1726882402.21827: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
13040 1726882402.21843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
13040 1726882402.21889: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
13040 1726882402.21953: Loaded config def from plugin (inventory/script)
13040 1726882402.21955: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
13040 1726882402.22009: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
13040 1726882402.22127: Loaded config def from plugin (inventory/yaml)
13040 1726882402.22129: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
13040 1726882402.22473: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
13040 1726882402.22907: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
13040 1726882402.22910: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
13040 1726882402.22913: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
13040 1726882402.22930: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
13040 1726882402.22936: Loading data from /tmp/network-91m/inventory.yml
13040 1726882402.23041: /tmp/network-91m/inventory.yml was not parsable by auto
13040 1726882402.23138: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
13040 1726882402.23245: Loading data from /tmp/network-91m/inventory.yml
13040 1726882402.23338: group all already in inventory
13040 1726882402.23345: set inventory_file for managed_node1
13040 1726882402.23349: set inventory_dir for managed_node1
13040 1726882402.23350: Added host managed_node1 to inventory
13040 1726882402.23356: Added host managed_node1 to group all
13040 1726882402.23357: set ansible_host for managed_node1
13040 1726882402.23358: set ansible_ssh_extra_args for managed_node1
13040 1726882402.23365: set inventory_file for managed_node2
13040 1726882402.23369: set inventory_dir for managed_node2
13040 1726882402.23370: Added host managed_node2 to inventory
13040 1726882402.23371: Added host managed_node2 to group all
13040 1726882402.23374: set ansible_host for managed_node2
13040 1726882402.23375: set ansible_ssh_extra_args for managed_node2
13040 1726882402.23378: set inventory_file for managed_node3
13040 1726882402.23380: set inventory_dir for managed_node3
13040 1726882402.23381: Added host managed_node3 to inventory
13040 1726882402.23382: Added host managed_node3 to group all
13040 1726882402.23383: set ansible_host for managed_node3
13040 1726882402.23384: set ansible_ssh_extra_args for managed_node3
13040 1726882402.23387: Reconcile groups and hosts in inventory.
13040 1726882402.23390: Group ungrouped now contains managed_node1
13040 1726882402.23392: Group ungrouped now contains managed_node2
13040 1726882402.23394: Group ungrouped now contains managed_node3
13040 1726882402.23511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
13040 1726882402.23679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
13040 1726882402.23732: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
13040 1726882402.23774: Loaded config def from plugin (vars/host_group_vars)
13040 1726882402.23777: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
13040 1726882402.23784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
13040 1726882402.23792: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
13040 1726882402.23870: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
13040 1726882402.24234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882402.24310: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
13040 1726882402.24333: Loaded config def from plugin (connection/local)
13040 1726882402.24335: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
13040 1726882402.24687: Loaded config def from plugin (connection/paramiko_ssh)
13040 1726882402.24690: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
13040 1726882402.25303: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
13040 1726882402.25326: Loaded config def from plugin (connection/psrp)
13040 1726882402.25328: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
13040 1726882402.25743: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
13040 1726882402.25772: Loaded config def from plugin (connection/ssh)
13040 1726882402.25774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
13040 1726882402.27818: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
13040 1726882402.27897: Loaded config def from plugin (connection/winrm)
13040 1726882402.27899: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
13040 1726882402.27922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
13040 1726882402.27987: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
13040 1726882402.28037: Loaded config def from plugin (shell/cmd)
13040 1726882402.28038: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
13040 1726882402.28058: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
13040 1726882402.28100: Loaded config def from plugin (shell/powershell)
13040 1726882402.28101: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
13040 1726882402.28137: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
13040 1726882402.28257: Loaded config def from plugin (shell/sh)
13040 1726882402.28258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
13040 1726882402.28284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
13040 1726882402.28471: Loaded config def from plugin (become/runas)
13040 1726882402.28473: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
13040 1726882402.28585: Loaded config def from plugin (become/su)
13040 1726882402.28587: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
13040 1726882402.28685: Loaded config def from plugin (become/sudo)
13040 1726882402.28687: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
13040 1726882402.28709: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml
13040 1726882402.28929: in VariableManager get_vars()
13040 1726882402.28944: done with get_vars()
13040 1726882402.29034: trying /usr/local/lib/python3.12/site-packages/ansible/modules
13040 1726882402.31652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
13040 1726882402.31778: in VariableManager get_vars()
13040 1726882402.31783: done with get_vars()
13040 1726882402.31786: variable 'playbook_dir' from source: magic vars
13040 1726882402.31787: variable 'ansible_playbook_python' from source: magic vars
13040 1726882402.31788: variable 'ansible_config_file' from source: magic vars
13040 1726882402.31789: variable 'groups' from source: magic vars
13040 1726882402.31789: variable 'omit' from source: magic vars
13040 1726882402.31790: variable 'ansible_version' from source: magic vars
13040 1726882402.31791: variable 'ansible_check_mode' from source: magic vars
13040 1726882402.31792: variable 'ansible_diff_mode' from source: magic vars
13040 1726882402.31792: variable 'ansible_forks' from source: magic vars
13040 1726882402.31793: variable 'ansible_inventory_sources' from source: magic vars
13040 1726882402.31794: variable 'ansible_skip_tags' from source: magic vars
13040 1726882402.31794: variable 'ansible_limit' from source: magic vars
13040 1726882402.31795: variable 'ansible_run_tags' from source: magic vars
13040 1726882402.31796: variable 'ansible_verbosity' from source: magic vars
13040 1726882402.31836: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml
13040 1726882402.32940: in VariableManager get_vars()
13040 1726882402.32956: done with get_vars()
13040 1726882402.32964: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
13040 1726882402.33574: in VariableManager get_vars()
13040 1726882402.33584: done with get_vars()
13040 1726882402.33590: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
13040 1726882402.33664: in VariableManager get_vars()
13040 1726882402.33675: done with get_vars()
13040 1726882402.33775: in VariableManager get_vars()
13040 1726882402.33784: done with get_vars()
13040 1726882402.33789: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
13040 1726882402.33835: in VariableManager get_vars()
13040 1726882402.33845: done with get_vars()
13040 1726882402.34034: in VariableManager get_vars()
13040 1726882402.34043: done with get_vars()
13040 1726882402.34046: variable 'omit' from source: magic vars
13040 1726882402.34060: variable 'omit' from source: magic vars
13040 1726882402.34082: in VariableManager get_vars()
13040 1726882402.34088: done with get_vars()
13040 1726882402.34119: in VariableManager get_vars()
13040 1726882402.34126: done with get_vars()
13040 1726882402.34154: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13040 1726882402.34287: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13040 1726882402.34367: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13040 1726882402.35166: in VariableManager get_vars()
13040 1726882402.35188: done with get_vars()
13040 1726882402.35717: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
13040 1726882402.35813: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13040 1726882402.36846: in VariableManager get_vars()
13040 1726882402.36861: done with get_vars()
13040 1726882402.36870: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
13040 1726882402.38184: in VariableManager get_vars()
13040 1726882402.38200: done with get_vars()
13040 1726882402.38288: in VariableManager get_vars()
13040 1726882402.38301: done with get_vars()
13040 1726882402.38556: in VariableManager get_vars()
13040 1726882402.38575: done with get_vars()
13040 1726882402.38580: variable 'omit' from source: magic vars
13040 1726882402.38591: variable 'omit' from source: magic vars
13040 1726882402.38747: variable 'controller_profile' from source: play vars
13040 1726882402.38794: in VariableManager get_vars()
13040 1726882402.38808: done with get_vars()
13040 1726882402.38829: in VariableManager get_vars()
13040 1726882402.38844: done with get_vars()
13040 1726882402.38877: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13040 1726882402.39002: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13040 1726882402.39080: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13040 1726882402.39467: in VariableManager get_vars()
13040 1726882402.39489: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13040 1726882402.41215: in VariableManager get_vars()
13040 1726882402.41231: done with get_vars()
13040 1726882402.41234: variable 'omit' from source: magic vars
13040 1726882402.41241: variable 'omit' from source: magic vars
13040 1726882402.41261: in VariableManager get_vars()
13040 1726882402.41277: done with get_vars()
13040 1726882402.41290: in VariableManager get_vars()
13040 1726882402.41301: done with get_vars()
13040 1726882402.41322: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13040 1726882402.41398: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13040 1726882402.41446: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13040 1726882402.41710: in VariableManager get_vars()
13040 1726882402.41731: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13040 1726882402.43535: in VariableManager get_vars()
13040 1726882402.43559: done with get_vars()
13040 1726882402.43566: variable 'omit' from source: magic vars
13040 1726882402.43577: variable 'omit' from source: magic vars
13040 1726882402.43608: in VariableManager get_vars()
13040 1726882402.43626: done with get_vars()
13040 1726882402.43644: in VariableManager get_vars()
13040 1726882402.43664: done with get_vars()
13040 1726882402.43693: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13040 1726882402.43810: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13040 1726882402.43891: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13040 1726882402.44289: in VariableManager get_vars()
13040 1726882402.44314: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13040 1726882402.46397: in VariableManager get_vars()
13040 1726882402.46419: done with get_vars()
13040 1726882402.46423: variable 'omit' from source: magic vars
13040 1726882402.46444: variable 'omit' from source: magic vars
13040 1726882402.46477: in VariableManager get_vars()
13040 1726882402.46496: done with get_vars()
13040 1726882402.46517: in VariableManager get_vars()
13040 1726882402.46558: done with get_vars()
13040 1726882402.46590: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13040 1726882402.46706: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13040 1726882402.46786: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13040 1726882402.47188: in VariableManager get_vars()
13040 1726882402.47217: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13040 1726882402.49247: in VariableManager get_vars()
13040 1726882402.49281: done with get_vars()
13040 1726882402.49291: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
13040 1726882402.50062: in VariableManager get_vars()
13040 1726882402.50094: done with get_vars()
13040 1726882402.50327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
13040 1726882402.50342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
13040 1726882402.50598: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
13040 1726882402.50885: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
13040 1726882402.50887: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
13040 1726882402.50909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
13040 1726882402.50925: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
13040 1726882402.51062: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
13040 1726882402.51100: Loaded config def from plugin (callback/default)
13040 1726882402.51102: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
13040 1726882402.51927: Loaded config def from plugin (callback/junit)
13040 1726882402.51929: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
13040 1726882402.51997: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
13040 1726882402.52084: Loaded config def from plugin (callback/minimal)
13040 1726882402.52086: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
13040 1726882402.52155: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
13040 1726882402.52226: Loaded config def from plugin (callback/tree)
13040 1726882402.52229: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
13040 1726882402.52365: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
13040 1726882402.52368: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bond_removal_initscripts.yml ***********************************
2 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml
13040 1726882402.52404: in VariableManager get_vars()
13040 1726882402.52419: done with get_vars()
13040 1726882402.52425: in VariableManager get_vars()
13040 1726882402.52437: done with get_vars()
13040 1726882402.52441: variable 'omit' from source: magic vars
13040 1726882402.52485: in VariableManager get_vars()
13040 1726882402.52507: done with get_vars()
13040 1726882402.52530: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_removal.yml' with initscripts as provider] ***
13040 1726882402.53871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
13040 1726882402.53956: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
13040 1726882402.53991: getting the remaining hosts for this loop
13040 1726882402.53992: done getting the remaining hosts for this loop
13040 1726882402.53996: getting the next task for host managed_node1
13040 1726882402.54000: done getting next task for host managed_node1
13040 1726882402.54002: ^ task is: TASK: Gathering Facts
13040 1726882402.54003: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882402.54006: getting variables
13040 1726882402.54007: in VariableManager get_vars()
13040 1726882402.54018: Calling all_inventory to load vars for managed_node1
13040 1726882402.54021: Calling groups_inventory to load vars for managed_node1
13040 1726882402.54024: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882402.54036: Calling all_plugins_play to load vars for managed_node1
13040 1726882402.54047: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882402.54050: Calling groups_plugins_play to load vars for managed_node1
13040 1726882402.54087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882402.54142: done with get_vars()
13040 1726882402.54149: done getting variables
13040 1726882402.54215: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml:5
Friday 20 September 2024 21:33:22 -0400 (0:00:00.019) 0:00:00.019 ******
13040 1726882402.54261: entering _queue_task() for managed_node1/gather_facts
13040 1726882402.54262: Creating lock for gather_facts
13040 1726882402.54599: worker is 1 (out of 1 available)
13040 1726882402.54608: exiting _queue_task() for managed_node1/gather_facts
13040 1726882402.54621: done queuing things up, now waiting for results queue to drain
13040 1726882402.54623: waiting for pending results...
13040 1726882402.54873: running TaskExecutor() for managed_node1/TASK: Gathering Facts
13040 1726882402.54978: in run() - task 0e448fcc-3ce9-b123-314b-0000000001bc
13040 1726882402.54999: variable 'ansible_search_path' from source: unknown
13040 1726882402.55038: calling self._execute()
13040 1726882402.55106: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882402.55116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882402.55127: variable 'omit' from source: magic vars
13040 1726882402.55231: variable 'omit' from source: magic vars
13040 1726882402.55262: variable 'omit' from source: magic vars
13040 1726882402.55307: variable 'omit' from source: magic vars
13040 1726882402.55353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13040 1726882402.55398: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13040 1726882402.55420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13040 1726882402.55440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13040 1726882402.55455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13040 1726882402.55489: variable 'inventory_hostname' from source: host vars for 'managed_node1'
13040 1726882402.55496: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882402.55507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882402.55609: Set connection var ansible_shell_executable to /bin/sh
13040 1726882402.55622: Set connection var ansible_timeout to 10
13040 1726882402.55633: Set connection var ansible_pipelining to False
13040 1726882402.55643: Set connection var ansible_shell_type to sh
13040 1726882402.55649: Set connection var ansible_connection to ssh
13040 1726882402.55657: Set connection var ansible_module_compression to ZIP_DEFLATED
13040 1726882402.55707: variable 'ansible_shell_executable' from source: unknown
13040 1726882402.55714: variable 'ansible_connection' from source: unknown
13040 1726882402.55724: variable 'ansible_module_compression' from source: unknown
13040 1726882402.55731: variable 'ansible_shell_type' from source: unknown
13040 1726882402.55737: variable 'ansible_shell_executable' from source: unknown
13040 1726882402.55743: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882402.55750: variable 'ansible_pipelining' from source: unknown
13040 1726882402.55758: variable 'ansible_timeout' from source: unknown
13040 1726882402.55771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882402.56408: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13040 1726882402.56424: variable 'omit' from source: magic vars
13040 1726882402.56433: starting attempt loop
13040 1726882402.56438: running the handler
13040 1726882402.56460: variable 'ansible_facts' from source: unknown
13040 1726882402.56484: _low_level_execute_command(): starting
13040 1726882402.56497: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13040 1726882402.57756: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13040 1726882402.57782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13040 1726882402.57798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13040 1726882402.57821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13040 1726882402.57867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
13040 1726882402.57882: stderr chunk (state=3): >>>debug2: match not found <<<
13040 1726882402.57896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13040 1726882402.57917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13040 1726882402.57930: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
13040 1726882402.57942: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13040 1726882402.57955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13040 1726882402.57972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13040 1726882402.57989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13040 1726882402.58000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
13040 1726882402.58011: stderr chunk (state=3): >>>debug2: match found <<<
13040 1726882402.58028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13040 1726882402.58105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
13040 1726882402.58132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13040 1726882402.58152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13040 1726882402.58358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13040 1726882402.60025: stdout chunk (state=3): >>>/root <<<
13040 1726882402.60220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13040 1726882402.60224: stdout chunk (state=3): >>><<<
13040 1726882402.60226: stderr chunk (state=3): >>><<<
13040 1726882402.60347: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
13040 1726882402.60350: _low_level_execute_command(): starting
13040 1726882402.60353: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882402.602458-13070-245196821940710 `" && echo ansible-tmp-1726882402.602458-13070-245196821940710="` echo /root/.ansible/tmp/ansible-tmp-1726882402.602458-13070-245196821940710 `" ) && sleep 0'
13040 1726882402.61031: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13040 1726882402.61055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13040 1726882402.61100:
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882402.61123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882402.61169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882402.61183: stderr chunk (state=3): >>>debug2: match not found <<< 13040 1726882402.61195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882402.61216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13040 1726882402.61230: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 13040 1726882402.61240: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13040 1726882402.61250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882402.61269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882402.61284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882402.61299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882402.61309: stderr chunk (state=3): >>>debug2: match found <<< 13040 1726882402.61321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882402.61410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882402.61436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882402.61458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882402.61679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882402.63450: stdout chunk 
(state=3): >>>ansible-tmp-1726882402.602458-13070-245196821940710=/root/.ansible/tmp/ansible-tmp-1726882402.602458-13070-245196821940710 <<< 13040 1726882402.63570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882402.63687: stderr chunk (state=3): >>><<< 13040 1726882402.63691: stdout chunk (state=3): >>><<< 13040 1726882402.63783: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882402.602458-13070-245196821940710=/root/.ansible/tmp/ansible-tmp-1726882402.602458-13070-245196821940710 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13040 1726882402.63787: variable 'ansible_module_compression' from source: unknown 13040 1726882402.63950: ANSIBALLZ: Using generic lock for ansible.legacy.setup 13040 1726882402.63953: ANSIBALLZ: Acquiring lock 13040 1726882402.63955: ANSIBALLZ: Lock acquired: 139648575893200 13040 1726882402.63958: ANSIBALLZ: 
Creating module 13040 1726882402.87873: ANSIBALLZ: Writing module into payload 13040 1726882402.87985: ANSIBALLZ: Writing module 13040 1726882402.88008: ANSIBALLZ: Renaming module 13040 1726882402.88014: ANSIBALLZ: Done creating module 13040 1726882402.88043: variable 'ansible_facts' from source: unknown 13040 1726882402.88054: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13040 1726882402.88059: _low_level_execute_command(): starting 13040 1726882402.88066: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 13040 1726882402.88735: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13040 1726882402.88744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882402.88757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882402.88770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882402.88808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882402.88817: stderr chunk (state=3): >>>debug2: match not found <<< 13040 1726882402.88832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882402.88846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13040 1726882402.88856: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 13040 1726882402.88859: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13040 1726882402.88869: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882402.88878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882402.88889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882402.88896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882402.88902: stderr chunk (state=3): >>>debug2: match found <<< 13040 1726882402.88911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882402.88991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882402.89006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882402.89009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882402.89145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882402.90828: stdout chunk (state=3): >>>PLATFORM <<< 13040 1726882402.90904: stdout chunk (state=3): >>>Linux <<< 13040 1726882402.90921: stdout chunk (state=3): >>>FOUND <<< 13040 1726882402.90924: stdout chunk (state=3): >>>/usr/bin/python3.9 <<< 13040 1726882402.90926: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 13040 1726882402.91062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882402.91124: stderr chunk (state=3): >>><<< 13040 1726882402.91162: stdout chunk (state=3): >>><<< 13040 1726882402.91168: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13040 1726882402.91179 [managed_node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 13040 1726882402.91217: _low_level_execute_command(): starting 13040 1726882402.91221: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 13040 1726882402.91382: Sending initial data 13040 1726882402.91385: Sent initial data (1181 bytes) 13040 1726882402.91982: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13040 1726882402.91985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882402.91988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882402.91990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882402.91992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882402.91994: stderr chunk (state=3): >>>debug2: match not found <<< 
13040 1726882402.91997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882402.91999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13040 1726882402.92001: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 13040 1726882402.92003: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13040 1726882402.92009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882402.92017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882402.92027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882402.92035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882402.92040: stderr chunk (state=3): >>>debug2: match found <<< 13040 1726882402.92054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882402.92122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882402.92136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882402.92145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882402.92438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882402.96214: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat 
Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 13040 1726882402.96575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882402.96630: stderr chunk (state=3): >>><<< 13040 1726882402.96634: stdout chunk (state=3): >>><<< 13040 1726882402.96644: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13040 
1726882402.96701: variable 'ansible_facts' from source: unknown 13040 1726882402.96704: variable 'ansible_facts' from source: unknown 13040 1726882402.96712: variable 'ansible_module_compression' from source: unknown 13040 1726882402.96746: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1304074tzu_9c/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13040 1726882402.96772: variable 'ansible_facts' from source: unknown 13040 1726882402.96877: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882402.602458-13070-245196821940710/AnsiballZ_setup.py 13040 1726882402.96999: Sending initial data 13040 1726882402.97003: Sent initial data (153 bytes) 13040 1726882402.97708: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882402.97712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882402.97742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882402.97748: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882402.97759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13040 1726882402.97779: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 13040 1726882402.97804: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882402.97807: stderr chunk (state=3): >>>debug2: match found <<< 13040 1726882402.97816: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882402.97872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882402.97893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882402.97896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882402.97993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882402.99727: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13040 1726882402.99813: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13040 1726882402.99901: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1304074tzu_9c/tmp1_2xjz5b /root/.ansible/tmp/ansible-tmp-1726882402.602458-13070-245196821940710/AnsiballZ_setup.py <<< 13040 1726882402.99988: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13040 1726882403.02512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882403.02765: stderr chunk (state=3): >>><<< 13040 1726882403.02769: stdout chunk (state=3): >>><<< 13040 1726882403.02771: done transferring module to remote 13040 1726882403.02773: _low_level_execute_command(): starting 13040 1726882403.02775: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882402.602458-13070-245196821940710/ /root/.ansible/tmp/ansible-tmp-1726882402.602458-13070-245196821940710/AnsiballZ_setup.py && sleep 0' 13040 1726882403.03709: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.03713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882403.03744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.03747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.03750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.03816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882403.03936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882403.04057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882403.05843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882403.05846: stdout chunk (state=3): >>><<< 13040 1726882403.05849: stderr chunk (state=3): >>><<< 13040 1726882403.05955: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13040 1726882403.05958: _low_level_execute_command(): starting 13040 1726882403.05961: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882402.602458-13070-245196821940710/AnsiballZ_setup.py && sleep 0' 13040 1726882403.06666: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13040 1726882403.06681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882403.06699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.06726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882403.06769: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882403.06785: stderr chunk (state=3): >>>debug2: match not found <<< 13040 1726882403.06799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.06824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13040 1726882403.06844: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 13040 1726882403.06858: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13040 1726882403.06874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882403.06888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.06908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882403.06920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882403.06935: stderr chunk (state=3): >>>debug2: match found <<< 13040 1726882403.06953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.07032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882403.07063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882403.07085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882403.07216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882403.09189: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 13040 1726882403.09194: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 13040 1726882403.09230: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 13040 1726882403.09272: 
stdout chunk (state=3): >>>import 'posix' # <<< 13040 1726882403.09301: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 13040 1726882403.09328: stdout chunk (state=3): >>>import 'time' # <<< 13040 1726882403.09346: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 13040 1726882403.09399: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882403.09426: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 13040 1726882403.09441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 13040 1726882403.09448: stdout chunk (state=3): >>>import '_codecs' # <<< 13040 1726882403.09474: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e98dc0> <<< 13040 1726882403.09517: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 13040 1726882403.09521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 13040 1726882403.09525: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e98b20> <<< 13040 1726882403.09554: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 13040 1726882403.09586: stdout chunk (state=3): >>>import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e98ac0> <<< 13040 1726882403.09589: stdout chunk (state=3): >>>import '_signal' # <<< 13040 1726882403.09625: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 13040 1726882403.09628: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e3d490> <<< 13040 1726882403.09661: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 13040 1726882403.09689: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 13040 1726882403.09705: stdout chunk (state=3): >>>import '_abc' # <<< 13040 1726882403.09721: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e3d940> <<< 13040 1726882403.09723: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e3d670> <<< 13040 1726882403.09757: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 13040 1726882403.09760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 13040 1726882403.09786: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 13040 1726882403.09797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 13040 1726882403.09821: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 13040 1726882403.09832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 13040 1726882403.09867: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93bcf190> <<< 13040 1726882403.09885: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 13040 1726882403.09897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 13040 1726882403.09973: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93bcf220> <<< 13040 1726882403.10014: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 13040 1726882403.10027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 13040 1726882403.10039: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93bf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93bcf940> <<< 13040 1726882403.10067: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e55880> <<< 13040 1726882403.10094: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 13040 1726882403.10108: stdout chunk 
(state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93bc7d90> <<< 13040 1726882403.10160: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 13040 1726882403.10167: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93bf2d90> <<< 13040 1726882403.10215: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e3d970> <<< 13040 1726882403.10247: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 13040 1726882403.10577: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 13040 1726882403.10584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 13040 1726882403.10612: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 13040 1726882403.10633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 13040 1726882403.10636: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 13040 1726882403.10694: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 13040 1726882403.10698: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 13040 1726882403.10700: stdout chunk (state=3): >>>import 
'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b6eeb0> <<< 13040 1726882403.10741: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b71f40> <<< 13040 1726882403.10785: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 13040 1726882403.10789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 13040 1726882403.10818: stdout chunk (state=3): >>>import '_sre' # <<< 13040 1726882403.10821: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 13040 1726882403.10823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 13040 1726882403.10849: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 13040 1726882403.11008: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b67610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b6d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b6e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 13040 1726882403.11011: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 13040 1726882403.11058: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882403.11106: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 13040 1726882403.11134: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 13040 1726882403.11137: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882403.11156: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93a54dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a548b0> <<< 13040 1726882403.11176: stdout chunk (state=3): >>>import 'itertools' # <<< 13040 1726882403.11198: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a54eb0> <<< 13040 1726882403.11245: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 13040 1726882403.11249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 13040 1726882403.11252: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a54f70> <<< 13040 1726882403.11254: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 13040 1726882403.11257: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a54e80> <<< 
13040 1726882403.11259: stdout chunk (state=3): >>>import '_collections' # <<< 13040 1726882403.11295: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b49d30> import '_functools' # <<< 13040 1726882403.11325: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b42610> <<< 13040 1726882403.11384: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b55670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b75e20> <<< 13040 1726882403.11409: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 13040 1726882403.11443: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93a66c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b49250> <<< 13040 1726882403.11487: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93b55280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b7b9d0> <<< 13040 1726882403.11519: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 13040 1726882403.11542: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882403.11593: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 13040 1726882403.11598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 13040 1726882403.11630: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a66fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a66d90> <<< 13040 1726882403.11690: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 13040 1726882403.11695: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a66d00> <<< 13040 1726882403.11697: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 13040 1726882403.11699: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 13040 1726882403.11702: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 13040 1726882403.11704: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 13040 1726882403.11705: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 13040 1726882403.11759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 13040 1726882403.11782: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a39370> <<< 13040 1726882403.11803: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 13040 1726882403.11815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 13040 1726882403.11847: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a39460> <<< 13040 1726882403.11973: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a6dfa0> <<< 13040 1726882403.12006: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a68a30> <<< 13040 1726882403.12019: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a68490> <<< 13040 1726882403.12039: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 13040 1726882403.12055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 13040 1726882403.12083: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches 
/usr/lib64/python3.9/weakref.py <<< 13040 1726882403.12099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 13040 1726882403.12116: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 13040 1726882403.12131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9396d1c0> <<< 13040 1726882403.12170: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a24c70> <<< 13040 1726882403.12217: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a68eb0> <<< 13040 1726882403.12224: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b7b040> <<< 13040 1726882403.12258: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 13040 1726882403.12272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 13040 1726882403.12295: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 13040 1726882403.12305: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9397faf0> <<< 13040 1726882403.12311: stdout chunk (state=3): >>>import 'errno' # <<< 13040 1726882403.12341: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882403.12356: stdout chunk (state=3): >>># extension module 'zlib' executed from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f9397fe20> <<< 13040 1726882403.12366: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 13040 1726882403.12399: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 13040 1726882403.12402: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93991730> <<< 13040 1726882403.12427: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 13040 1726882403.12458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 13040 1726882403.12487: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93991c70> <<< 13040 1726882403.12534: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f939293a0> <<< 13040 1726882403.12542: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9397ff10> <<< 13040 1726882403.12566: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 13040 1726882403.12569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 
13040 1726882403.12605: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f9393a280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f939915b0> <<< 13040 1726882403.12612: stdout chunk (state=3): >>>import 'pwd' # <<< 13040 1726882403.12644: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f9393a340> <<< 13040 1726882403.12693: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a669d0> <<< 13040 1726882403.12701: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 13040 1726882403.12723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 13040 1726882403.12741: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 13040 1726882403.12748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 13040 1726882403.12786: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f939556a0> <<< 13040 1726882403.12800: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 13040 1726882403.12829: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93955970> <<< 13040 1726882403.12836: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93955760> <<< 13040 1726882403.12867: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93955850> <<< 13040 1726882403.12900: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 13040 1726882403.13095: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93955ca0> <<< 13040 1726882403.13141: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' 
import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f939621f0> <<< 13040 1726882403.13144: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f939558e0> <<< 13040 1726882403.13146: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93949a30> <<< 13040 1726882403.13177: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a665b0> <<< 13040 1726882403.13195: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 13040 1726882403.13256: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 13040 1726882403.13284: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93955a90> <<< 13040 1726882403.13433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 13040 1726882403.13436: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3f93892670> <<< 13040 1726882403.13742: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 13040 1726882403.13836: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.13908: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 13040 1726882403.13914: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 13040 1726882403.13916: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.15113: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.16048: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937267f0> <<< 13040 1726882403.16111: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 13040 1726882403.16115: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f937b7760> <<< 13040 1726882403.16142: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b7640> <<< 13040 1726882403.16203: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b7370> <<< 13040 1726882403.16207: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 13040 1726882403.16242: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b7490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b7190> import 'atexit' # <<< 13040 1726882403.16286: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f937b7400> <<< 13040 1726882403.16308: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 13040 1726882403.16319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 13040 1726882403.16363: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b77c0> <<< 13040 1726882403.16396: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 13040 1726882403.16420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 13040 1726882403.16449: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 13040 1726882403.16455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 
13040 1726882403.16536: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937907c0> <<< 13040 1726882403.16588: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93790b50> <<< 13040 1726882403.16623: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f937909a0> <<< 13040 1726882403.16626: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 13040 1726882403.16674: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f931874f0> <<< 13040 1726882403.16681: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b0d30> <<< 13040 1726882403.16858: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b7520> <<< 13040 1726882403.16891: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b0190> <<< 13040 1726882403.16903: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 13040 1726882403.16954: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 13040 1726882403.16991: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 13040 1726882403.17004: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937e1a90> <<< 13040 1726882403.17087: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93784190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93784790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9318dd00> <<< 13040 1726882403.17128: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f937846a0> <<< 13040 1726882403.17148: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3f93805d30> <<< 13040 1726882403.17171: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 13040 1726882403.17194: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 13040 1726882403.17229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 13040 1726882403.17328: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f931e59a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93810e50> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 13040 1726882403.17331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 13040 1726882403.17380: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f931f50d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93810e20> <<< 13040 1726882403.17400: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 13040 
1726882403.17431: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882403.17477: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 13040 1726882403.17480: stdout chunk (state=3): >>>import '_string' # <<< 13040 1726882403.17525: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93817220> <<< 13040 1726882403.17650: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f931f5100> <<< 13040 1726882403.17748: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f937dbb80> <<< 13040 1726882403.17775: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93810ac0> <<< 13040 1726882403.17809: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882403.17831: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93810d00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93892820> <<< 13040 1726882403.17865: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 13040 1726882403.17878: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 13040 1726882403.17921: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f931f10d0> <<< 13040 1726882403.18102: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882403.18112: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f931e7370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f931f1d00> <<< 13040 1726882403.18163: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f3f931f16a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f931f2130> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 13040 1726882403.18191: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.18269: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.18371: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13040 1726882403.18378: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 13040 1726882403.18380: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.18396: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 13040 1726882403.18399: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.18491: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.18593: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.19029: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.19506: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 13040 1726882403.19518: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882403.19582: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f9374f8b0> <<< 13040 1726882403.19657: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93754910> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d756a0> <<< 13040 1726882403.19704: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 13040 1726882403.19744: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.19763: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 13040 1726882403.19879: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.20014: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 13040 1726882403.20026: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9378e7f0> # zipimport: zlib available <<< 13040 1726882403.20428: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.20780: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.20833: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.20900: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 13040 1726882403.20903: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.20927: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.20970: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 13040 1726882403.21023: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.21113: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 13040 1726882403.21128: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 13040 1726882403.21168: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13040 1726882403.21208: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 13040 1726882403.21392: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.21585: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 13040 1726882403.21608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 13040 1726882403.21623: stdout chunk (state=3): >>>import '_ast' # <<< 13040 1726882403.21690: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d7ad90> <<< 13040 1726882403.21693: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.21742: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.21832: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 13040 1726882403.21862: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.21876: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.21914: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 13040 1726882403.21962: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.21988: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.22088: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.22138: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 13040 1726882403.22178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882403.22236: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f937420a0> <<< 13040 1726882403.22330: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d41070> <<< 13040 1726882403.22373: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 13040 1726882403.22423: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.22483: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.22499: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.22538: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 13040 1726882403.22544: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 13040 1726882403.22572: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 13040 1726882403.22597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 13040 1726882403.22619: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 13040 1726882403.22640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 13040 1726882403.22724: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9374b160> <<< 13040 1726882403.22757: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93747cd0> <<< 13040 1726882403.22818: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d7abb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 13040 1726882403.22823: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.22848: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.22872: stdout chunk (state=3): >>>import 
ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 13040 1726882403.22943: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 13040 1726882403.22982: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.22985: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 13040 1726882403.22988: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23035: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23102: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13040 1726882403.23129: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23159: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23201: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23225: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23260: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 13040 1726882403.23267: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23341: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23401: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 13040 1726882403.23423: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23450: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py <<< 13040 1726882403.23456: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23606: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23747: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23774: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.23839: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py <<< 13040 1726882403.23842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882403.23845: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 13040 1726882403.23902: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 13040 1726882403.23927: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92af5a60> <<< 13040 1726882403.23931: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 13040 1726882403.23933: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 13040 1726882403.23981: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 13040 1726882403.23991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 13040 1726882403.24013: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 13040 1726882403.24016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 13040 1726882403.24031: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d546d0> <<< 13040 1726882403.24068: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f92d54af0> <<< 13040 1726882403.24133: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d3b250> <<< 13040 1726882403.24139: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d3ba30> <<< 13040 1726882403.24183: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d8a460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d8a910> <<< 13040 1726882403.24193: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 13040 1726882403.24226: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 13040 1726882403.24229: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 13040 1726882403.24280: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f92d87d00> <<< 13040 1726882403.24289: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d87d60> <<< 13040 1726882403.24292: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 13040 1726882403.24313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 13040 1726882403.24330: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d87250> <<< 13040 1726882403.24358: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 13040 1726882403.24373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 13040 1726882403.24409: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' 
import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f92b5df70> <<< 13040 1726882403.24457: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d9b4c0> <<< 13040 1726882403.24460: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d8a310> <<< 13040 1726882403.24478: stdout chunk (state=3): >>>import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 13040 1726882403.24494: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 13040 1726882403.24501: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13040 1726882403.24504: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 13040 1726882403.24506: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.24572: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.24614: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 13040 1726882403.24620: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.24666: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.24721: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available <<< 13040 1726882403.24724: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 13040 1726882403.24747: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.24770: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.24799: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 13040 1726882403.24805: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.24854: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.24891: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 13040 1726882403.24897: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.24934: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.24974: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 13040 1726882403.24980: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.25033: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.25083: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.25131: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 13040 1726882403.25178: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py <<< 13040 1726882403.25192: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 13040 1726882403.25585: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.25945: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 13040 1726882403.25953: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.25985: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.26042: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.26067: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.26098: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 13040 1726882403.26104: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.26130: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.26158: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 13040 1726882403.26167: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.26216: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.26264: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 13040 1726882403.26271: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.26294: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.26322: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 13040 1726882403.26356: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.26392: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 13040 1726882403.26447: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.26547: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 13040 1726882403.26570: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92a6cca0> <<< 13040 1726882403.26590: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 13040 
1726882403.26595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 13040 1726882403.26757: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92a6cfd0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 13040 1726882403.26809: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.26873: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 13040 1726882403.26945: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.27026: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 13040 1726882403.27090: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.27168: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 13040 1726882403.27174: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.27195: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.27253: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 13040 1726882403.27257: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 13040 1726882403.27401: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f92a69370> <<< 13040 1726882403.27649: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92ab8bb0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 13040 1726882403.27655: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.27689: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.27744: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 13040 1726882403.27823: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.27885: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.27986: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.28115: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 13040 1726882403.28158: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 13040 1726882403.28198: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 13040 1726882403.28201: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.28226: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.28377: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f929f0160> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f929f02b0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available <<< 13040 1726882403.28395: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 13040 1726882403.28398: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.28421: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.28477: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 13040 1726882403.28610: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.28726: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 13040 1726882403.28803: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.28887: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.28918: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.28961: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 13040 1726882403.28977: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.29053: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.29070: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.29182: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.29309: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 13040 1726882403.29313: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.29410: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.29817: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13040 1726882403.30017: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.30418: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py <<< 13040 1726882403.30431: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available <<< 13040 1726882403.30526: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.30616: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 13040 1726882403.30619: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.30690: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.30780: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 13040 1726882403.30908: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31069: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 13040 1726882403.31072: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31074: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31076: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 13040 1726882403.31092: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31110: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31160: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 13040 1726882403.31163: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31250: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31317: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31483: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31655: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 13040 1726882403.31690: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31693: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31729: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 13040 1726882403.31761: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31789: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 13040 1726882403.31841: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31913: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 13040 1726882403.31940: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.31967: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 13040 1726882403.32015: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.32066: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 13040 1726882403.32118: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.32178: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 13040 1726882403.32181: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.32383: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.32603: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 13040 1726882403.32607: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.32655: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.32726: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 13040 1726882403.32739: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.32783: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 13040 1726882403.32808: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.32840: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 13040 1726882403.32899: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13040 1726882403.32922: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 13040 1726882403.32984: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.33450: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import 
ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 13040 1726882403.33482: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.33530: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 13040 1726882403.33975: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available <<< 13040 1726882403.34025: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 13040 1726882403.34093: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.34175: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 13040 1726882403.34179: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.34261: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.34319: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 13040 1726882403.34416: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882403.35177: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 13040 1726882403.35199: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 13040 1726882403.35240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 13040 1726882403.35244: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f92a40760> <<< 13040 1726882403.35257: 
stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92a40df0> <<< 13040 1726882403.35312: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92859850> <<< 13040 1726882403.36511: stdout chunk (state=3): >>>import 'gc' # <<< 13040 1726882403.41775: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 13040 1726882403.41779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 13040 1726882403.41799: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92a403a0> <<< 13040 1726882403.41817: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 13040 1726882403.41834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 13040 1726882403.41845: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f929f0eb0> <<< 13040 1726882403.41907: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882403.41947: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' <<< 13040 1726882403.41955: stdout chunk (state=3): 
>>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9284f310> <<< 13040 1726882403.41968: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f929eb700> <<< 13040 1726882403.42272: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 13040 1726882403.42275: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame<<< 13040 1726882403.42277: stdout chunk (state=3): >>> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 13040 1726882403.63217: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "23", "epoch": "1726882403", "epoch_int": "1726882403", "date": "2024-09-20", "time": "21:33:23", "iso8601_micro": "2024-09-21T01:33:23.350225Z", "iso8601": "2024-09-21T01:33:23Z", "iso8601_basic": "20240920T213323350225", "iso8601_basic_short": "20240920T213323", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": 
"5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_loadavg": {"1m": 0.63, "5m": 0.37, "15m": 0.18}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw<<< 13040 1726882403.63228: stdout chunk (state=3): >>>_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", 
"netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off 
[fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_process<<< 13040 1726882403.63245: stdout chunk (state=3): >>>or_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2807, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 725, "free": 2807}, "nocache": {"free": 3268, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", 
"ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 561, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241577984, "block_size": 4096, "block_total": 65519355, "block_available": 64512104, "block_used": 1007251, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": 
"enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "ansible_fips": false, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13040 1726882403.63740: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear 
sys.last_value<<< 13040 1726882403.63744: stdout chunk (state=3): >>> # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 13040 1726882403.63747: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 <<< 13040 1726882403.63751: stdout chunk (state=3): >>># cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib <<< 13040 1726882403.63789: 
stdout chunk (state=3): >>># cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil <<< 13040 1726882403.63812: stdout chunk (state=3): >>># destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json <<< 13040 1726882403.63816: stdout chunk (state=3): >>># cleanup[2] removing json.scanner # 
cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string <<< 13040 1726882403.63818: stdout chunk (state=3): >>># cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common <<< 13040 1726882403.63839: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing 
ansible.module_utils.compat.selinux <<< 13040 1726882403.63857: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing 
ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context <<< 13040 1726882403.63919: stdout chunk (state=3): >>># cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] 
removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version <<< 13040 1726882403.63923: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin <<< 13040 1726882403.63925: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata <<< 13040 1726882403.63931: stdout chunk (state=3): >>># cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 13040 1726882403.64189: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 13040 1726882403.64205: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 13040 1726882403.64240: stdout chunk (state=3): >>># destroy zipimport <<< 13040 1726882403.64246: stdout chunk (state=3): >>># destroy _compression <<< 13040 1726882403.64262: stdout chunk (state=3): >>># destroy binascii # destroy 
importlib # destroy bz2 # destroy lzma <<< 13040 1726882403.64295: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 13040 1726882403.64298: stdout chunk (state=3): >>># destroy _json # destroy encodings <<< 13040 1726882403.64320: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 13040 1726882403.64366: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 13040 1726882403.64424: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 13040 1726882403.64429: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle <<< 13040 1726882403.64432: stdout chunk (state=3): >>># destroy _compat_pickle <<< 13040 1726882403.64449: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction <<< 13040 1726882403.64473: stdout chunk (state=3): >>># destroy shlex <<< 13040 1726882403.64491: stdout chunk (state=3): >>># destroy datetime # destroy base64 <<< 13040 1726882403.64507: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 13040 1726882403.64524: stdout chunk (state=3): >>># destroy json <<< 13040 1726882403.64557: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing <<< 13040 1726882403.64575: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy 
multiprocessing.dummy.connection <<< 13040 1726882403.64616: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios <<< 13040 1726882403.64649: stdout chunk (state=3): >>># cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 13040 1726882403.64676: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 13040 1726882403.64703: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl <<< 13040 1726882403.64726: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma<<< 13040 1726882403.64753: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 13040 1726882403.64780: stdout chunk 
(state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile <<< 13040 1726882403.64808: stdout chunk (state=3): >>># destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 13040 1726882403.64832: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path <<< 13040 1726882403.64858: stdout chunk (state=3): >>># destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 <<< 13040 1726882403.64885: stdout chunk (state=3): >>># cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 13040 1726882403.64898: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 13040 1726882403.64926: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios <<< 13040 1726882403.64945: stdout chunk (state=3): >>># destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy 
systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 13040 1726882403.65105: stdout chunk (state=3): >>># destroy platform <<< 13040 1726882403.65128: stdout chunk (state=3): >>># destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 13040 1726882403.65149: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat <<< 13040 1726882403.65169: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 13040 1726882403.65196: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 13040 1726882403.65210: stdout chunk (state=3): >>># destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 13040 1726882403.65240: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 13040 1726882403.65532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 13040 1726882403.65634: stderr chunk (state=3): >>><<< 13040 1726882403.65637: stdout chunk (state=3): >>><<< 13040 1726882403.65903: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e98dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e98b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e98ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e3d490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e3d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e3d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93bcf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93bcf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93bf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93bcf940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e55880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93bc7d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93bf2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93e3d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b6eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b71f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b67610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b6d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b6e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93a54dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a548b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a54eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a54f70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a54e80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b49d30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b42610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b55670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b75e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93a66c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b49250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93b55280> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b7b9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a66fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a66d90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a66d00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a39370> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a39460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a6dfa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a68a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a68490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9396d1c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a24c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a68eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93b7b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9397faf0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f9397fe20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93991730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93991c70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f939293a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9397ff10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f9393a280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f939915b0> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f9393a340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a669d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f939556a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93955970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93955760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93955850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93955ca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f939621f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f939558e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93949a30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93a665b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93955a90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3f93892670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937267f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f937b7760> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b7640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b7370> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b7490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b7190> import 'atexit' # # extension module 'fcntl' loaded from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f937b7400> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b77c0> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937907c0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93790b50> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f937909a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from 
'/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f931874f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b0d30> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b7520> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937b0190> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f937e1a90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93784190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93784790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9318dd00> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f937846a0> # 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93805d30> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f931e59a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93810e50> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f931f50d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93810e20> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93817220> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f931f5100> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f937dbb80> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93810ac0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f93810d00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93892820> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module 
'_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f931f10d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f931e7370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f931f1d00> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f931f16a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f931f2130> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f9374f8b0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93754910> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d756a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9378e7f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d7ad90> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f937420a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d41070> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import 
ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9374b160> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f93747cd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d7abb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # 
zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92af5a60> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d546d0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f92d54af0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d3b250> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d3ba30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d8a460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d8a910> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f92d87d00> import 'queue' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d87d60> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d87250> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f92b5df70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d9b4c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92d8a310> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3f92a6cca0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92a6cfd0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f92a69370> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92ab8bb0> import ansible.module_utils.facts.system.python # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f929f0160> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f929f02b0> import ansible.module_utils.facts.system.user # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ywwklzf8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 
'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3f92a40760> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92a40df0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92859850> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f92a403a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f929f0eb0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f9284f310> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3f929eb700> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": 
"CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "23", "epoch": "1726882403", "epoch_int": "1726882403", "date": "2024-09-20", "time": "21:33:23", "iso8601_micro": "2024-09-21T01:33:23.350225Z", "iso8601": "2024-09-21T01:33:23Z", "iso8601_basic": "20240920T213323350225", "iso8601_basic_short": "20240920T213323", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_loadavg": {"1m": 0.63, "5m": 0.37, "15m": 0.18}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off 
[fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", 
"netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_fibre_channel_wwn": [], 
"ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2807, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 725, "free": 2807}, "nocache": {"free": 3268, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": 
{"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 561, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241577984, "block_size": 4096, "block_total": 65519355, "block_available": 64512104, "block_used": 1007251, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", 
"XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "ansible_fips": false, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # 
destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing 
ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] 
removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing 
ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing 
ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # 
destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy 
linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # 
cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed.
[WARNING]: Module invocation had junk after the JSON data: [Python interpreter shutdown output elided: repeated "# clear ...", "# cleanup[2] removing ...", "# cleanup[3] wiping ...", and "# destroy ..." module-teardown messages, ending with "# clear sys.audit hooks"]
[WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
13040 1726882403.67473: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882402.602458-13070-245196821940710/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13040 1726882403.67477: _low_level_execute_command(): starting 13040 1726882403.67482: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882402.602458-13070-245196821940710/ > /dev/null 2>&1 && sleep 0' 13040 1726882403.68179: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.68183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882403.68216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.68220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.68222: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.68293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882403.68297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882403.68314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882403.68407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882403.70286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882403.70385: stderr chunk (state=3): >>><<< 13040 1726882403.70415: stdout chunk (state=3): >>><<< 13040 1726882403.70681: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 13040 1726882403.70686: handler run complete 13040 1726882403.70689: variable 'ansible_facts' from source: unknown 13040 1726882403.70712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882403.71321: variable 'ansible_facts' from source: unknown 13040 1726882403.71392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882403.71532: attempt loop complete, returning result 13040 1726882403.71536: _execute() done 13040 1726882403.71540: dumping result to json 13040 1726882403.71569: done dumping result, returning 13040 1726882403.71577: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-b123-314b-0000000001bc] 13040 1726882403.71582: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001bc ok: [managed_node1] 13040 1726882403.73024: no more pending results, returning what we have 13040 1726882403.73027: results queue empty 13040 1726882403.73028: checking for any_errors_fatal 13040 1726882403.73029: done checking for any_errors_fatal 13040 1726882403.73030: checking for max_fail_percentage 13040 1726882403.73032: done checking for max_fail_percentage 13040 1726882403.73032: checking to see if all hosts have failed and the running result is not ok 13040 1726882403.73033: done checking to see if all hosts have failed 13040 1726882403.73034: getting the remaining hosts for this loop 13040 1726882403.73035: done getting the remaining hosts for this loop 13040 1726882403.73039: getting the next task for host managed_node1 13040 1726882403.73045: done getting next task for host managed_node1 13040 1726882403.73047: ^ task is: TASK: meta (flush_handlers) 13040 1726882403.73049: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks 
child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882403.73055: getting variables 13040 1726882403.73056: in VariableManager get_vars() 13040 1726882403.73080: Calling all_inventory to load vars for managed_node1 13040 1726882403.73088: Calling groups_inventory to load vars for managed_node1 13040 1726882403.73091: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882403.73101: Calling all_plugins_play to load vars for managed_node1 13040 1726882403.73102: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882403.73105: Calling groups_plugins_play to load vars for managed_node1 13040 1726882403.73223: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001bc 13040 1726882403.73227: WORKER PROCESS EXITING 13040 1726882403.73236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882403.73354: done with get_vars() 13040 1726882403.73362: done getting variables 13040 1726882403.73414: in VariableManager get_vars() 13040 1726882403.73421: Calling all_inventory to load vars for managed_node1 13040 1726882403.73422: Calling groups_inventory to load vars for managed_node1 13040 1726882403.73424: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882403.73427: Calling all_plugins_play to load vars for managed_node1 13040 1726882403.73428: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882403.73430: Calling groups_plugins_play to load vars for managed_node1 13040 1726882403.73510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882403.73619: done with get_vars() 13040 1726882403.73630: done queuing things up, now waiting for results queue to drain 13040 1726882403.73632: results queue empty 13040 1726882403.73632: checking for any_errors_fatal 13040 
1726882403.73634: done checking for any_errors_fatal 13040 1726882403.73634: checking for max_fail_percentage 13040 1726882403.73635: done checking for max_fail_percentage 13040 1726882403.73639: checking to see if all hosts have failed and the running result is not ok 13040 1726882403.73639: done checking to see if all hosts have failed 13040 1726882403.73640: getting the remaining hosts for this loop 13040 1726882403.73640: done getting the remaining hosts for this loop 13040 1726882403.73642: getting the next task for host managed_node1 13040 1726882403.73646: done getting next task for host managed_node1 13040 1726882403.73647: ^ task is: TASK: Include the task 'el_repo_setup.yml' 13040 1726882403.73649: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882403.73650: getting variables 13040 1726882403.73651: in VariableManager get_vars() 13040 1726882403.73657: Calling all_inventory to load vars for managed_node1 13040 1726882403.73658: Calling groups_inventory to load vars for managed_node1 13040 1726882403.73659: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882403.73662: Calling all_plugins_play to load vars for managed_node1 13040 1726882403.73665: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882403.73667: Calling groups_plugins_play to load vars for managed_node1 13040 1726882403.73757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882403.73859: done with get_vars() 13040 1726882403.73867: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml:10 Friday 20 September 2024 21:33:23 -0400 (0:00:01.196) 0:00:01.216 ****** 13040 1726882403.73921: entering _queue_task() for managed_node1/include_tasks 13040 1726882403.73923: Creating lock for include_tasks 13040 1726882403.74180: worker is 1 (out of 1 available) 13040 1726882403.74192: exiting _queue_task() for managed_node1/include_tasks 13040 1726882403.74202: done queuing things up, now waiting for results queue to drain 13040 1726882403.74203: waiting for pending results... 
13040 1726882403.74483: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 13040 1726882403.74589: in run() - task 0e448fcc-3ce9-b123-314b-000000000006 13040 1726882403.74609: variable 'ansible_search_path' from source: unknown 13040 1726882403.74652: calling self._execute() 13040 1726882403.74729: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882403.74739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882403.74756: variable 'omit' from source: magic vars 13040 1726882403.74860: _execute() done 13040 1726882403.74873: dumping result to json 13040 1726882403.74881: done dumping result, returning 13040 1726882403.74891: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-b123-314b-000000000006] 13040 1726882403.74902: sending task result for task 0e448fcc-3ce9-b123-314b-000000000006 13040 1726882403.75051: no more pending results, returning what we have 13040 1726882403.75057: in VariableManager get_vars() 13040 1726882403.75093: Calling all_inventory to load vars for managed_node1 13040 1726882403.75097: Calling groups_inventory to load vars for managed_node1 13040 1726882403.75101: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882403.75115: Calling all_plugins_play to load vars for managed_node1 13040 1726882403.75118: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882403.75122: Calling groups_plugins_play to load vars for managed_node1 13040 1726882403.75308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882403.75561: done with get_vars() 13040 1726882403.75570: variable 'ansible_search_path' from source: unknown 13040 1726882403.75582: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000006 13040 1726882403.75585: WORKER PROCESS EXITING 13040 1726882403.75592: we have 
included files to process 13040 1726882403.75593: generating all_blocks data 13040 1726882403.75594: done generating all_blocks data 13040 1726882403.75595: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13040 1726882403.75596: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13040 1726882403.75598: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13040 1726882403.76169: in VariableManager get_vars() 13040 1726882403.76179: done with get_vars() 13040 1726882403.76187: done processing included file 13040 1726882403.76188: iterating over new_blocks loaded from include file 13040 1726882403.76189: in VariableManager get_vars() 13040 1726882403.76195: done with get_vars() 13040 1726882403.76195: filtering new block on tags 13040 1726882403.76205: done filtering new block on tags 13040 1726882403.76207: in VariableManager get_vars() 13040 1726882403.76212: done with get_vars() 13040 1726882403.76213: filtering new block on tags 13040 1726882403.76223: done filtering new block on tags 13040 1726882403.76225: in VariableManager get_vars() 13040 1726882403.76232: done with get_vars() 13040 1726882403.76233: filtering new block on tags 13040 1726882403.76241: done filtering new block on tags 13040 1726882403.76242: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 13040 1726882403.76246: extending task lists for all hosts with included blocks 13040 1726882403.76276: done extending task lists 13040 1726882403.76277: done processing included files 13040 1726882403.76278: results queue empty 13040 1726882403.76278: checking for any_errors_fatal 13040 1726882403.76279: done checking for any_errors_fatal 13040 
1726882403.76279: checking for max_fail_percentage 13040 1726882403.76280: done checking for max_fail_percentage 13040 1726882403.76280: checking to see if all hosts have failed and the running result is not ok 13040 1726882403.76281: done checking to see if all hosts have failed 13040 1726882403.76281: getting the remaining hosts for this loop 13040 1726882403.76282: done getting the remaining hosts for this loop 13040 1726882403.76284: getting the next task for host managed_node1 13040 1726882403.76286: done getting next task for host managed_node1 13040 1726882403.76287: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 13040 1726882403.76289: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882403.76290: getting variables 13040 1726882403.76291: in VariableManager get_vars() 13040 1726882403.76296: Calling all_inventory to load vars for managed_node1 13040 1726882403.76298: Calling groups_inventory to load vars for managed_node1 13040 1726882403.76299: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882403.76303: Calling all_plugins_play to load vars for managed_node1 13040 1726882403.76304: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882403.76306: Calling groups_plugins_play to load vars for managed_node1 13040 1726882403.76398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882403.76509: done with get_vars() 13040 1726882403.76514: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:33:23 -0400 (0:00:00.026) 0:00:01.242 ****** 13040 1726882403.76559: entering _queue_task() for managed_node1/setup 13040 1726882403.76773: worker is 1 (out of 1 available) 13040 1726882403.76783: exiting _queue_task() for managed_node1/setup 13040 1726882403.76796: done queuing things up, now waiting for results queue to drain 13040 1726882403.76798: waiting for pending results... 
13040 1726882403.76952: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 13040 1726882403.77022: in run() - task 0e448fcc-3ce9-b123-314b-0000000001cd 13040 1726882403.77031: variable 'ansible_search_path' from source: unknown 13040 1726882403.77034: variable 'ansible_search_path' from source: unknown 13040 1726882403.77066: calling self._execute() 13040 1726882403.77124: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882403.77127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882403.77137: variable 'omit' from source: magic vars 13040 1726882403.77441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882403.78991: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882403.79036: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882403.79070: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882403.79097: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882403.79115: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882403.79177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882403.79198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882403.79215: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882403.79241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882403.79251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882403.79379: variable 'ansible_facts' from source: unknown 13040 1726882403.79428: variable 'network_test_required_facts' from source: task vars 13040 1726882403.79458: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 13040 1726882403.79462: variable 'omit' from source: magic vars 13040 1726882403.79495: variable 'omit' from source: magic vars 13040 1726882403.79520: variable 'omit' from source: magic vars 13040 1726882403.79540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13040 1726882403.79562: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13040 1726882403.79577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13040 1726882403.79590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13040 1726882403.79600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13040 1726882403.79624: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13040 1726882403.79627: variable 'ansible_host' from source: host vars for 
'managed_node1' 13040 1726882403.79629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882403.79704: Set connection var ansible_shell_executable to /bin/sh 13040 1726882403.79708: Set connection var ansible_timeout to 10 13040 1726882403.79718: Set connection var ansible_pipelining to False 13040 1726882403.79722: Set connection var ansible_shell_type to sh 13040 1726882403.79726: Set connection var ansible_connection to ssh 13040 1726882403.79731: Set connection var ansible_module_compression to ZIP_DEFLATED 13040 1726882403.79749: variable 'ansible_shell_executable' from source: unknown 13040 1726882403.79755: variable 'ansible_connection' from source: unknown 13040 1726882403.79757: variable 'ansible_module_compression' from source: unknown 13040 1726882403.79760: variable 'ansible_shell_type' from source: unknown 13040 1726882403.79762: variable 'ansible_shell_executable' from source: unknown 13040 1726882403.79767: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882403.79769: variable 'ansible_pipelining' from source: unknown 13040 1726882403.79771: variable 'ansible_timeout' from source: unknown 13040 1726882403.79774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882403.79872: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13040 1726882403.79878: variable 'omit' from source: magic vars 13040 1726882403.79883: starting attempt loop 13040 1726882403.79886: running the handler 13040 1726882403.79897: _low_level_execute_command(): starting 13040 1726882403.79903: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13040 1726882403.80426: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.80441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882403.80457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 13040 1726882403.80472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.80491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.80528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882403.80540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882403.80650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882403.82291: stdout chunk (state=3): >>>/root <<< 13040 1726882403.82391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882403.82451: stderr chunk (state=3): >>><<< 13040 1726882403.82454: stdout chunk (state=3): >>><<< 13040 1726882403.82479: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13040 1726882403.82490: _low_level_execute_command(): starting 13040 1726882403.82497: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882403.8247912-13135-172547929632783 `" && echo ansible-tmp-1726882403.8247912-13135-172547929632783="` echo /root/.ansible/tmp/ansible-tmp-1726882403.8247912-13135-172547929632783 `" ) && sleep 0' 13040 1726882403.82975: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.82988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882403.83010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 13040 1726882403.83023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 13040 1726882403.83032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.83080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882403.83091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882403.83196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882403.85069: stdout chunk (state=3): >>>ansible-tmp-1726882403.8247912-13135-172547929632783=/root/.ansible/tmp/ansible-tmp-1726882403.8247912-13135-172547929632783 <<< 13040 1726882403.85173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882403.85235: stderr chunk (state=3): >>><<< 13040 1726882403.85238: stdout chunk (state=3): >>><<< 13040 1726882403.85256: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882403.8247912-13135-172547929632783=/root/.ansible/tmp/ansible-tmp-1726882403.8247912-13135-172547929632783 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13040 1726882403.85297: variable 'ansible_module_compression' from source: unknown 13040 1726882403.85341: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1304074tzu_9c/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13040 1726882403.85392: variable 'ansible_facts' from source: unknown 13040 1726882403.85510: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882403.8247912-13135-172547929632783/AnsiballZ_setup.py 13040 1726882403.85624: Sending initial data 13040 1726882403.85633: Sent initial data (154 bytes) 13040 1726882403.86322: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882403.86325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882403.86363: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.86368: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.86370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.86422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882403.86425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882403.86526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882403.88243: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13040 1726882403.88334: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13040 1726882403.88427: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1304074tzu_9c/tmpve9p6xim /root/.ansible/tmp/ansible-tmp-1726882403.8247912-13135-172547929632783/AnsiballZ_setup.py <<< 13040 1726882403.88515: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13040 1726882403.90472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882403.90650: stderr chunk (state=3): >>><<< 
13040 1726882403.90667: stdout chunk (state=3): >>><<< 13040 1726882403.90788: done transferring module to remote 13040 1726882403.90791: _low_level_execute_command(): starting 13040 1726882403.90793: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882403.8247912-13135-172547929632783/ /root/.ansible/tmp/ansible-tmp-1726882403.8247912-13135-172547929632783/AnsiballZ_setup.py && sleep 0' 13040 1726882403.91382: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13040 1726882403.91395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882403.91409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.91426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882403.91479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882403.91496: stderr chunk (state=3): >>>debug2: match not found <<< 13040 1726882403.91511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.91528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13040 1726882403.91541: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 13040 1726882403.91562: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13040 1726882403.91576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882403.91590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.91606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882403.91618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 <<< 13040 1726882403.91629: stderr chunk (state=3): >>>debug2: match found <<< 13040 1726882403.91641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.91723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882403.91744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882403.91770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882403.91889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882403.93632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882403.93707: stderr chunk (state=3): >>><<< 13040 1726882403.93718: stdout chunk (state=3): >>><<< 13040 1726882403.93833: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 13040 1726882403.93837: _low_level_execute_command(): starting 13040 1726882403.93839: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882403.8247912-13135-172547929632783/AnsiballZ_setup.py && sleep 0' 13040 1726882403.94529: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13040 1726882403.94542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882403.94558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.94578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882403.94626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882403.94640: stderr chunk (state=3): >>>debug2: match not found <<< 13040 1726882403.94657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.94682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13040 1726882403.94693: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 13040 1726882403.94703: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13040 1726882403.94717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882403.94729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882403.94743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882403.94756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882403.94768: stderr chunk (state=3): >>>debug2: match found <<< 13040 
1726882403.94780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882403.94862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882403.94885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882403.94900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882403.95029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882403.96988: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # <<< 13040 1726882403.96991: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 13040 1726882403.97034: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 13040 1726882403.97078: stdout chunk (state=3): >>>import 'posix' # <<< 13040 1726882403.97114: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 13040 1726882403.97117: stdout chunk (state=3): >>># installing zipimport hook <<< 13040 1726882403.97157: stdout chunk (state=3): >>>import 'time' # <<< 13040 1726882403.97160: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 13040 1726882403.97208: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882403.97244: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 13040 1726882403.97258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 13040 1726882403.97273: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f73dc0> <<< 13040 
1726882403.97313: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 13040 1726882403.97341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f73b20> <<< 13040 1726882403.97378: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 13040 1726882403.97397: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f73ac0> <<< 13040 1726882403.97438: stdout chunk (state=3): >>>import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 13040 1726882403.97441: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f18490> <<< 13040 1726882403.97473: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 13040 1726882403.97503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f18940> <<< 13040 1726882403.97517: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f76f6f18670> <<< 13040 1726882403.97553: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 13040 1726882403.97558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 13040 1726882403.97602: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 13040 1726882403.97617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 13040 1726882403.97640: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 13040 1726882403.97660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6ecf190> <<< 13040 1726882403.97683: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 13040 1726882403.97697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 13040 1726882403.97784: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6ecf220> <<< 13040 1726882403.97818: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 13040 1726882403.97835: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6ef2850> import 'posixpath' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6ecf940> <<< 13040 1726882403.97871: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f30880> <<< 13040 1726882403.97885: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6ec8d90> <<< 13040 1726882403.97953: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 13040 1726882403.97956: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6ef2d90> <<< 13040 1726882403.98001: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f18970> <<< 13040 1726882403.98030: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 13040 1726882403.98367: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 13040 1726882403.98397: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 13040 1726882403.98422: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 13040 1726882403.98441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 13040 1726882403.98467: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 13040 1726882403.98485: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e6eeb0> <<< 13040 1726882403.98517: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e71f40> <<< 13040 1726882403.98547: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 13040 1726882403.98557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 13040 1726882403.98593: stdout chunk (state=3): >>>import '_sre' # <<< 13040 1726882403.98613: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 13040 1726882403.98632: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # 
code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 13040 1726882403.98658: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e67610> <<< 13040 1726882403.98680: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e6d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e6e370> <<< 13040 1726882403.98696: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 13040 1726882403.98760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 13040 1726882403.98782: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 13040 1726882403.98817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882403.98833: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 13040 1726882403.98874: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6b4ce20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b4c910> import 'itertools' # <<< 13040 1726882403.98908: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from 
'/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b4cf10> <<< 13040 1726882403.98919: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 13040 1726882403.98956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 13040 1726882403.98987: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b4cfd0> <<< 13040 1726882403.99008: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b5f0d0> import '_collections' # <<< 13040 1726882403.99057: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e49d90> <<< 13040 1726882403.99079: stdout chunk (state=3): >>>import '_functools' # <<< 13040 1726882403.99091: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e42670> <<< 13040 1726882403.99158: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 13040 1726882403.99176: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e556d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e75e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 13040 1726882403.99205: stdout chunk 
(state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6b5fcd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e492b0> <<< 13040 1726882403.99249: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882403.99280: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6e552e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e7b9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 13040 1726882403.99304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py <<< 13040 1726882403.99329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 13040 1726882403.99355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b5feb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b5fdf0> <<< 13040 1726882403.99374: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py <<< 13040 1726882403.99410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b5fd60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 13040 1726882403.99436: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 13040 1726882403.99456: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 13040 1726882403.99475: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 13040 1726882403.99515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 13040 1726882403.99541: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b323d0> <<< 13040 1726882403.99568: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 13040 1726882403.99582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 13040 1726882403.99602: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b324c0> <<< 13040 1726882403.99725: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f76f6b66f40> <<< 13040 1726882403.99770: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b61a90> <<< 13040 1726882403.99799: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b61490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 13040 1726882403.99822: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 13040 1726882403.99840: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 13040 1726882403.99873: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 13040 1726882403.99886: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a5b220> <<< 13040 1726882403.99915: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b1d520> <<< 13040 1726882403.99967: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b61f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e7b040> <<< 13040 1726882403.99987: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 13040 1726882404.00007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 13040 1726882404.00047: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object 
from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a6db50> import 'errno' # <<< 13040 1726882404.00096: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a6de80> <<< 13040 1726882404.00108: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 13040 1726882404.00142: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 13040 1726882404.00163: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a7e790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 13040 1726882404.00192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 13040 1726882404.00220: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a7ecd0> <<< 13040 1726882404.00279: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a0c400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a6df70> <<< 13040 
1726882404.00294: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 13040 1726882404.00349: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.00378: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a1d2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a7e610> import 'pwd' # <<< 13040 1726882404.00390: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a1d3a0> <<< 13040 1726882404.00417: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b5fa30> <<< 13040 1726882404.00449: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 13040 1726882404.00461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 13040 1726882404.00492: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 13040 1726882404.00536: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a38700> <<< 13040 1726882404.00569: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 13040 1726882404.00610: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a389d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a387c0> <<< 13040 1726882404.00637: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a388b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 13040 1726882404.00826: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a38d00> <<< 13040 1726882404.00875: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.00896: 
stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a43250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a38940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a2ca90> <<< 13040 1726882404.00910: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b5f610> <<< 13040 1726882404.00919: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 13040 1726882404.00981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 13040 1726882404.01011: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a38af0> <<< 13040 1726882404.01155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 13040 1726882404.01168: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f76f695c6d0> <<< 13040 1726882404.01409: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip' <<< 13040 1726882404.01415: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.01505: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.01527: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/__init__.py <<< 13040 1726882404.01539: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.01544: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.01562: stdout chunk (state=3): >>>import 
ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 13040 1726882404.01571: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.02776: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.03695: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 13040 1726882404.03700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62ba820> <<< 13040 1726882404.03703: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.03726: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 13040 1726882404.03748: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 13040 1726882404.03776: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.03781: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f634a730> <<< 13040 1726882404.03817: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f634a610> 
<<< 13040 1726882404.03844: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f634a340> <<< 13040 1726882404.03869: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 13040 1726882404.03916: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f634a460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f634a160> <<< 13040 1726882404.03923: stdout chunk (state=3): >>>import 'atexit' # <<< 13040 1726882404.03956: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.03963: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f634a3a0> <<< 13040 1726882404.03970: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 13040 1726882404.03995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 13040 1726882404.04030: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f634a790> <<< 13040 1726882404.04057: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 13040 1726882404.04068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 13040 1726882404.04087: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches 
/usr/lib64/python3.9/subprocess.py <<< 13040 1726882404.04101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 13040 1726882404.04123: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 13040 1726882404.04127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 13040 1726882404.04214: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6339820> <<< 13040 1726882404.04245: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.04250: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6339490> <<< 13040 1726882404.04275: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.04289: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6339640> <<< 13040 1726882404.04300: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 13040 1726882404.04306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 13040 1726882404.04333: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f623f520> <<< 13040 1726882404.04352: stdout chunk (state=3): 
>>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6344d60> <<< 13040 1726882404.04516: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f634a4f0> <<< 13040 1726882404.04533: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 13040 1726882404.04558: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f63441c0> <<< 13040 1726882404.04574: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 13040 1726882404.04583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 13040 1726882404.04615: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 13040 1726882404.04624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 13040 1726882404.04640: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 13040 1726882404.04646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 13040 1726882404.04672: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 13040 1726882404.04679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6348b20> <<< 13040 1726882404.04762: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6318160> import 
'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6318760> <<< 13040 1726882404.04766: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6245d30> <<< 13040 1726882404.04787: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.04792: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6318670> <<< 13040 1726882404.04814: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.04823: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f639ad00> <<< 13040 1726882404.04833: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 13040 1726882404.04854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 13040 1726882404.04867: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 13040 1726882404.04904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 13040 1726882404.04968: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.04979: stdout chunk (state=3): >>># extension module 
'_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f629ba00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f63a4e80> <<< 13040 1726882404.04992: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 13040 1726882404.05001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 13040 1726882404.05055: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.05063: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f62aa0a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f63a4eb0> <<< 13040 1726882404.05078: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 13040 1726882404.05118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.05137: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 13040 1726882404.05147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 13040 1726882404.05205: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f63ac250> <<< 13040 1726882404.05338: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62aa0d0> <<< 13040 
1726882404.05427: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.05443: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f63aca60> <<< 13040 1726882404.05487: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f636eb80> <<< 13040 1726882404.05502: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f63a4cd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f639aee0> <<< 13040 1726882404.05525: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 13040 1726882404.05542: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 13040 1726882404.05560: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 13040 
1726882404.05615: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f62a60d0> <<< 13040 1726882404.05801: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f629d310> <<< 13040 1726882404.05851: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62a6cd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f62a6670> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62a7100> # zipimport: zlib available # zipimport: zlib available <<< 13040 1726882404.05873: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 13040 1726882404.05885: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.05954: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.06047: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13040 1726882404.06079: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from 
Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 13040 1726882404.06183: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.06276: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.06738: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.07193: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 13040 1726882404.07207: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 13040 1726882404.07229: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 13040 1726882404.07239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.07292: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f62e4910> <<< 13040 1726882404.07359: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # 
code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 13040 1726882404.07377: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62e99a0> <<< 13040 1726882404.07384: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5e42640> <<< 13040 1726882404.07427: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 13040 1726882404.07433: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.07461: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.07478: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/_text.py <<< 13040 1726882404.07484: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.07598: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.07722: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 13040 1726882404.07728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 13040 1726882404.07750: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f63207f0> <<< 13040 1726882404.07760: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.08145: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.08515: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.08567: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.08632: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip 
/tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/collections.py <<< 13040 1726882404.08637: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.08672: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.08724: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 13040 1726882404.08727: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.08766: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.08847: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 13040 1726882404.08865: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 13040 1726882404.08877: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.08915: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.08949: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 13040 1726882404.08952: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.09143: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.09330: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 13040 1726882404.09359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 13040 1726882404.09376: stdout chunk 
(state=3): >>>import '_ast' # <<< 13040 1726882404.09440: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6366460> # zipimport: zlib available <<< 13040 1726882404.09509: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.09570: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 13040 1726882404.09574: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 13040 1726882404.09588: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 13040 1726882404.09603: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.09637: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.09673: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 13040 1726882404.09686: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.09714: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.09755: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.09841: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.09900: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 13040 1726882404.09924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.09997: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f62d80d0> <<< 13040 1726882404.10100: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62e91f0> <<< 13040 1726882404.10130: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/process.py <<< 13040 1726882404.10138: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.10194: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.10270: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.10285: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.10314: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 13040 1726882404.10331: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 13040 1726882404.10342: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 13040 1726882404.10375: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 13040 1726882404.10398: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 13040 1726882404.10419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 13040 1726882404.10494: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62ebbb0> <<< 13040 1726882404.10535: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f63b5070> <<< 13040 1726882404.10615: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62dc2e0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 13040 1726882404.10634: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.10657: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 13040 1726882404.10731: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 13040 1726882404.10758: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 13040 1726882404.10774: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 13040 1726882404.10816: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.10873: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.10897: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.10909: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.10944: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.10982: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.11008: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.11048: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 13040 1726882404.11059: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.11115: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.11180: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.11238: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.11275: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 13040 1726882404.11379: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.11515: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.11553: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.11599: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.11625: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 13040 1726882404.11631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 13040 1726882404.11660: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 13040 1726882404.11672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 13040 1726882404.11689: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5df5400> <<< 13040 1726882404.11715: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 13040 1726882404.11743: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 13040 1726882404.11766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 13040 1726882404.11793: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 13040 1726882404.11800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 13040 1726882404.11813: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5e549a0> <<< 13040 1726882404.11846: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.11854: stdout 
chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f5e54df0> <<< 13040 1726882404.11909: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5e52490> <<< 13040 1726882404.11931: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5ccd040> <<< 13040 1726882404.11954: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5baec70> <<< 13040 1726882404.11968: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5baea30> <<< 13040 1726882404.11980: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 13040 1726882404.12005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 13040 1726882404.12020: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 13040 1726882404.12028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 13040 1726882404.12057: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.12070: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f62d76d0> <<< 13040 1726882404.12077: stdout chunk (state=3): >>>import 'queue' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f76f5e41730> <<< 13040 1726882404.12091: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 13040 1726882404.12101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 13040 1726882404.12135: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62d75e0> <<< 13040 1726882404.12142: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 13040 1726882404.12168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 13040 1726882404.12196: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.12200: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f5e04c70> <<< 13040 1726882404.12222: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5c1c9a0> <<< 13040 1726882404.12254: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5baeb20> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 13040 1726882404.12269: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip 
/tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 13040 1726882404.12281: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13040 1726882404.12293: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 13040 1726882404.12303: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12360: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12407: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 13040 1726882404.12412: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12451: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12497: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 13040 1726882404.12500: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12519: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12523: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 13040 1726882404.12541: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12568: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12597: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # 
zipimport: zlib available <<< 13040 1726882404.12646: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12687: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 13040 1726882404.12695: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12726: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12769: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 13040 1726882404.12773: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12826: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12871: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12922: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.12968: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py <<< 13040 1726882404.12978: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 13040 1726882404.12981: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.13378: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.13735: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 13040 1726882404.13742: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 
1726882404.13785: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.13837: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.13867: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.13907: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 13040 1726882404.13920: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.13926: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.13960: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 13040 1726882404.14024: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14034: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14059: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 13040 1726882404.14082: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14132: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14155: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 13040 1726882404.14162: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14182: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 13040 1726882404.14192: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14259: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14326: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 13040 1726882404.14346: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5bae640> <<< 13040 1726882404.14372: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 13040 1726882404.14393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 13040 1726882404.14541: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5b3af40> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 13040 1726882404.14558: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14611: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14667: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 13040 1726882404.14676: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14747: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14827: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip 
/tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 13040 1726882404.14836: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14884: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14950: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 13040 1726882404.14958: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.14997: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.15032: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 13040 1726882404.15059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 13040 1726882404.15200: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f5b353a0> <<< 13040 1726882404.15439: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5b82100> <<< 13040 1726882404.15442: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 13040 1726882404.15447: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.15495: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.15544: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip 
/tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 13040 1726882404.15550: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.15618: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.15691: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.15781: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.15909: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 13040 1726882404.15926: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.15958: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.15995: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 13040 1726882404.16001: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16035: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16084: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py <<< 13040 1726882404.16089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 13040 1726882404.16133: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.16144: stdout chunk (state=3): >>># extension module 'termios' executed from 
'/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f5ac86a0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5ac8a90> <<< 13040 1726882404.16160: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 13040 1726882404.16173: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16179: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16182: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 13040 1726882404.16189: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16217: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16267: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 13040 1726882404.16396: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16522: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 13040 1726882404.16528: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16606: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16691: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16721: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16759: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 13040 1726882404.16771: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 13040 1726882404.16778: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16843: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16868: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.16980: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.17097: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 13040 1726882404.17115: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.17213: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.17317: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 13040 1726882404.17322: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.17355: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.17388: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.17815: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.18218: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip 
/tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py <<< 13040 1726882404.18226: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 13040 1726882404.18229: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.18319: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.18409: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 13040 1726882404.18417: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.18493: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.18580: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 13040 1726882404.18585: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.18715: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.18838: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 13040 1726882404.18865: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.18874: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.18877: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 13040 1726882404.18884: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 13040 1726882404.18919: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.18958: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 13040 1726882404.18969: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19047: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19131: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19299: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19473: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 13040 1726882404.19484: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19509: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19549: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 13040 1726882404.19558: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19581: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19594: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 13040 1726882404.19606: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19666: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13040 1726882404.19724: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 13040 1726882404.19728: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19761: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19775: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 13040 1726882404.19788: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19830: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19884: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 13040 1726882404.19890: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19940: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.19990: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 13040 1726882404.20001: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20211: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20427: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 13040 1726882404.20432: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20482: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 
1726882404.20532: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 13040 1726882404.20541: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20570: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20607: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 13040 1726882404.20610: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20638: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20668: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 13040 1726882404.20681: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20701: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20736: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 13040 1726882404.20741: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20810: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20873: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 13040 1726882404.20896: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20902: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20911: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 13040 1726882404.20914: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20956: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.20998: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 13040 1726882404.21001: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21018: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21043: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21079: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21122: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21178: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21237: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py <<< 13040 1726882404.21262: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 13040 1726882404.21265: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21306: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21344: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip 
/tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 13040 1726882404.21356: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21512: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21674: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 13040 1726882404.21679: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21719: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21763: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 13040 1726882404.21770: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21805: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21851: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 13040 1726882404.21858: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21925: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.21989: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 13040 1726882404.21998: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 13040 1726882404.22001: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13040 1726882404.22079: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.22150: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 13040 1726882404.22155: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 13040 1726882404.22232: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.22418: stdout chunk (state=3): >>>import 'gc' # <<< 13040 1726882404.22803: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 13040 1726882404.22809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 13040 1726882404.22834: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 13040 1726882404.22838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 13040 1726882404.22886: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f5aaa5e0> <<< 13040 1726882404.22891: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5aac0d0> <<< 
13040 1726882404.22961: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5aac8e0> <<< 13040 1726882404.25024: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "24", "epoch": "1726882404", "epoch_int": "1726882404", "date": "2024-09-20", "time": "21:33:24", "iso8601_micro": "2024-09-21T01:33:24.222892Z", "iso8601": "2024-09-21T01:33:24Z", "iso8601_basic": "20240920T213324222892", "iso8601_basic_short": "20240920T213324", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_s<<< 13040 1726882404.25030: stdout chunk (state=3): >>>sh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"465293f2bd9b457497a5eaf565f184f8", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13040 1726882404.25552: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv <<< 13040 1726882404.25557: stdout chunk (state=3): >>># clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread<<< 13040 1726882404.25560: stdout chunk (state=3): >>> # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings <<< 13040 1726882404.25686: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath <<< 13040 1726882404.25715: stdout chunk 
(state=3): >>># cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre<<< 13040 1726882404.25786: stdout chunk (state=3): >>> # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 13040 1726882404.25830: stdout chunk (state=3): >>># cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util <<< 13040 1726882404.25859: stdout chunk (state=3): >>># cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression <<< 13040 1726882404.25929: stdout chunk (state=3): >>># cleanup[2] removing threading # cleanup[2] removing _bz2 # 
destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma <<< 13040 1726882404.25952: stdout chunk (state=3): >>># cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib <<< 13040 1726882404.25986: stdout chunk (state=3): >>># cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils <<< 13040 1726882404.26001: stdout chunk (state=3): >>># destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors <<< 13040 1726882404.26004: stdout chunk (state=3): >>># cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback <<< 13040 1726882404.26006: stdout chunk (state=3): >>># cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal <<< 13040 1726882404.26008: stdout chunk 
(state=3): >>># cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six <<< 13040 1726882404.26009: stdout chunk (state=3): >>># destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes <<< 13040 1726882404.26013: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections <<< 13040 1726882404.26014: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast <<< 13040 1726882404.26016: stdout chunk (state=3): >>># destroy _ast # cleanup[2] removing ast # destroy ast 
# cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters <<< 13040 1726882404.26018: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 13040 1726882404.26019: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro<<< 13040 1726882404.26020: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing 
multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue <<< 13040 1726882404.26024: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter <<< 13040 1726882404.26025: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env <<< 13040 1726882404.26027: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform <<< 13040 1726882404.26028: stdout chunk (state=3): >>># cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # 
cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass <<< 13040 1726882404.26029: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly <<< 13040 1726882404.26031: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin<<< 13040 1726882404.26032: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] 
removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos<<< 13040 1726882404.26033: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd<<< 13040 1726882404.26034: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts<<< 13040 1726882404.26035: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy 
ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local<<< 13040 1726882404.26036: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix <<< 13040 1726882404.26038: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd <<< 13040 1726882404.26039: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux<<< 13040 1726882404.26040: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat <<< 13040 1726882404.26042: stdout chunk (state=3): >>># cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 13040 1726882404.26278: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 13040 1726882404.26297: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 13040 1726882404.26324: stdout chunk (state=3): >>># destroy zipimport <<< 13040 1726882404.26338: stdout chunk (state=3): >>># destroy _compression <<< 13040 1726882404.26347: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 13040 1726882404.26382: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy 
hashlib <<< 13040 1726882404.26386: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 13040 1726882404.26388: stdout chunk (state=3): >>># destroy _json # destroy encodings <<< 13040 1726882404.26409: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 13040 1726882404.26454: stdout chunk (state=3): >>># destroy selinux <<< 13040 1726882404.26457: stdout chunk (state=3): >>># destroy distro # destroy logging # destroy argparse <<< 13040 1726882404.26495: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 13040 1726882404.26498: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 13040 1726882404.26506: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 13040 1726882404.26538: stdout chunk (state=3): >>># destroy queue <<< 13040 1726882404.26561: stdout chunk (state=3): >>># destroy multiprocessing.process <<< 13040 1726882404.26569: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 13040 1726882404.26572: stdout chunk (state=3): >>># destroy shlex <<< 13040 1726882404.26578: stdout chunk (state=3): >>># destroy datetime <<< 13040 1726882404.26584: stdout chunk (state=3): >>># destroy base64 <<< 13040 1726882404.26597: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 13040 1726882404.26609: stdout chunk (state=3): >>># destroy json <<< 13040 1726882404.26624: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 13040 1726882404.26629: stdout chunk (state=3): >>># destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 13040 1726882404.26672: stdout chunk 
(state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata <<< 13040 1726882404.26701: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 13040 1726882404.26718: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian <<< 13040 1726882404.26742: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid <<< 13040 1726882404.26753: stdout chunk (state=3): >>># cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 13040 1726882404.26780: stdout chunk (state=3): >>># destroy subprocess <<< 13040 1726882404.26791: stdout chunk (state=3): >>># cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 13040 1726882404.26809: stdout chunk (state=3): >>># destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd <<< 13040 1726882404.26838: stdout chunk (state=3): >>># cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib <<< 13040 
1726882404.26852: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re <<< 13040 1726882404.26866: stdout chunk (state=3): >>># destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq <<< 13040 1726882404.26880: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 13040 1726882404.26890: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 13040 1726882404.26893: stdout chunk (state=3): >>># cleanup[3] wiping os.path <<< 13040 1726882404.26895: stdout chunk (state=3): >>># destroy genericpath # cleanup[3] wiping posixpath <<< 13040 1726882404.26897: stdout chunk (state=3): >>># cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref <<< 13040 1726882404.26899: stdout chunk (state=3): >>># cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 13040 1726882404.26901: stdout chunk (state=3): >>># cleanup[3] wiping 
builtins <<< 13040 1726882404.26925: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios <<< 13040 1726882404.26938: stdout chunk (state=3): >>># destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket <<< 13040 1726882404.26942: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 13040 1726882404.27110: stdout chunk (state=3): >>># destroy platform <<< 13040 1726882404.27123: stdout chunk (state=3): >>># destroy _uuid # destroy _sre # destroy sre_parse <<< 13040 1726882404.27136: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq # destroy posixpath <<< 13040 1726882404.27140: stdout chunk (state=3): >>># destroy stat <<< 13040 1726882404.27156: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 13040 1726882404.27177: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator <<< 13040 1726882404.27190: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves # destroy _operator <<< 13040 1726882404.27196: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 13040 1726882404.27230: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 13040 1726882404.27539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.44.90 closed. <<< 13040 1726882404.27602: stderr chunk (state=3): >>><<< 13040 1726882404.27606: stdout chunk (state=3): >>><<< 13040 1726882404.27713: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f73dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f73b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f73ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f76f6f18490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f18940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f18670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6ecf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6ecf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6ef2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6ecf940> import 'os' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f30880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6ec8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6ef2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6f18970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e6eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e71f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e67610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e6d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e6e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6b4ce20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b4c910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b4cf10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b4cfd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b5f0d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e49d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e42670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e556d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e75e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6b5fcd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e492b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6e552e0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e7b9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b5feb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b5fdf0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b5fd60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b323d0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b324c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b66f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b61a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b61490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a5b220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b1d520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b61f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6e7b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a6db50> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a6de80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a7e790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a7ecd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a0c400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a6df70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a1d2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a7e610> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a1d3a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b5fa30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a38700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a389d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a387c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a388b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a38d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6a43250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a38940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a2ca90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6b5f610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6a38af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f76f695c6d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62ba820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f634a730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f634a610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f634a340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f634a460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f634a160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f634a3a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f634a790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6339820> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6339490> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6339640> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f623f520> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f76f6344d60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f634a4f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f63441c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6348b20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6318160> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6318760> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6245d30> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f6318670> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f639ad00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f629ba00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f63a4e80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f62aa0a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f63a4eb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f63ac250> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62aa0d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f63aca60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f636eb80> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f63a4cd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f639aee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f76f62a60d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f629d310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62a6cd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f62a6670> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62a7100> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters 
# loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f62e4910> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62e99a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5e42640> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f63207f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f6366460> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f62d80d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62e91f0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f76f62ebbb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f63b5070> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62dc2e0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5df5400> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5e549a0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f5e54df0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5e52490> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5ccd040> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5baec70> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5baea30> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f62d76d0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5e41730> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f62d75e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f5e04c70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f76f5c1c9a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5baeb20> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip 
/tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5bae640> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5b3af40> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f5b353a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5b82100> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f5ac86a0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5ac8a90> import 
ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_zmeg55c3/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: 
zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76f5aaa5e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5aac0d0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76f5aac8e0> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "33", "second": "24", "epoch": "1726882404", "epoch_int": "1726882404", "date": "2024-09-20", "time": "21:33:24", "iso8601_micro": "2024-09-21T01:33:24.222892Z", "iso8601": "2024-09-21T01:33:24Z", "iso8601_basic": "20240920T213324222892", "iso8601_basic_short": "20240920T213324", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 
0, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # 
cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # 
cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing 
array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec 
# destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing 
ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing 
ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy 
importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid 
# cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping 
_frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. [WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # 
cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # 
cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # 
cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file 
# cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing 
ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing 
ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # 
destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] 
wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # 
destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 13040 1726882404.28569: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882403.8247912-13135-172547929632783/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13040 1726882404.28573: _low_level_execute_command(): starting 13040 1726882404.28575: _low_level_execute_command(): executing: /bin/sh 
-c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882403.8247912-13135-172547929632783/ > /dev/null 2>&1 && sleep 0' 13040 1726882404.28601: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882404.28605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882404.28647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.28651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 13040 1726882404.28653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 13040 1726882404.28655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.28707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882404.28717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882404.28723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882404.28825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882404.30640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882404.30695: stderr chunk (state=3): >>><<< 13040 1726882404.30698: stdout chunk (state=3): >>><<< 13040 1726882404.30714: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13040 1726882404.30721: handler run complete 13040 1726882404.30748: variable 'ansible_facts' from source: unknown 13040 1726882404.30791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.30864: variable 'ansible_facts' from source: unknown 13040 1726882404.30897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.30935: attempt loop complete, returning result 13040 1726882404.30938: _execute() done 13040 1726882404.30940: dumping result to json 13040 1726882404.30947: done dumping result, returning 13040 1726882404.30956: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 
[0e448fcc-3ce9-b123-314b-0000000001cd] 13040 1726882404.30961: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001cd ok: [managed_node1] 13040 1726882404.31204: no more pending results, returning what we have 13040 1726882404.31207: results queue empty 13040 1726882404.31207: checking for any_errors_fatal 13040 1726882404.31209: done checking for any_errors_fatal 13040 1726882404.31210: checking for max_fail_percentage 13040 1726882404.31211: done checking for max_fail_percentage 13040 1726882404.31212: checking to see if all hosts have failed and the running result is not ok 13040 1726882404.31213: done checking to see if all hosts have failed 13040 1726882404.31213: getting the remaining hosts for this loop 13040 1726882404.31215: done getting the remaining hosts for this loop 13040 1726882404.31218: getting the next task for host managed_node1 13040 1726882404.31226: done getting next task for host managed_node1 13040 1726882404.31228: ^ task is: TASK: Check if system is ostree 13040 1726882404.31231: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882404.31234: getting variables 13040 1726882404.31235: in VariableManager get_vars() 13040 1726882404.31262: Calling all_inventory to load vars for managed_node1 13040 1726882404.31266: Calling groups_inventory to load vars for managed_node1 13040 1726882404.31269: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.31279: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.31281: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.31283: Calling groups_plugins_play to load vars for managed_node1 13040 1726882404.31388: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001cd 13040 1726882404.31391: WORKER PROCESS EXITING 13040 1726882404.31421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.31539: done with get_vars() 13040 1726882404.31546: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:33:24 -0400 (0:00:00.550) 0:00:01.793 ****** 13040 1726882404.31612: entering _queue_task() for managed_node1/stat 13040 1726882404.31801: worker is 1 (out of 1 available) 13040 1726882404.31812: exiting _queue_task() for managed_node1/stat 13040 1726882404.31823: done queuing things up, now waiting for results queue to drain 13040 1726882404.31824: waiting for pending results... 
13040 1726882404.31970: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 13040 1726882404.32036: in run() - task 0e448fcc-3ce9-b123-314b-0000000001cf 13040 1726882404.32046: variable 'ansible_search_path' from source: unknown 13040 1726882404.32050: variable 'ansible_search_path' from source: unknown 13040 1726882404.32082: calling self._execute() 13040 1726882404.32137: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.32140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.32148: variable 'omit' from source: magic vars 13040 1726882404.32415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13040 1726882404.32605: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13040 1726882404.32637: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13040 1726882404.32662: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13040 1726882404.32690: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13040 1726882404.32754: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13040 1726882404.32772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13040 1726882404.32789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882404.32806: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13040 1726882404.32897: Evaluated conditional (not __network_is_ostree is defined): True 13040 1726882404.32901: variable 'omit' from source: magic vars 13040 1726882404.32927: variable 'omit' from source: magic vars 13040 1726882404.32958: variable 'omit' from source: magic vars 13040 1726882404.32975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13040 1726882404.32996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13040 1726882404.33010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13040 1726882404.33023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13040 1726882404.33032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13040 1726882404.33056: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13040 1726882404.33061: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.33064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.33126: Set connection var ansible_shell_executable to /bin/sh 13040 1726882404.33130: Set connection var ansible_timeout to 10 13040 1726882404.33139: Set connection var ansible_pipelining to False 13040 1726882404.33145: Set connection var ansible_shell_type to sh 13040 1726882404.33148: Set connection var ansible_connection to ssh 13040 1726882404.33156: Set connection var ansible_module_compression to ZIP_DEFLATED 13040 1726882404.33170: variable 'ansible_shell_executable' from source: unknown 13040 1726882404.33173: variable 'ansible_connection' from 
source: unknown 13040 1726882404.33177: variable 'ansible_module_compression' from source: unknown 13040 1726882404.33179: variable 'ansible_shell_type' from source: unknown 13040 1726882404.33181: variable 'ansible_shell_executable' from source: unknown 13040 1726882404.33183: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.33186: variable 'ansible_pipelining' from source: unknown 13040 1726882404.33188: variable 'ansible_timeout' from source: unknown 13040 1726882404.33192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.33286: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13040 1726882404.33294: variable 'omit' from source: magic vars 13040 1726882404.33301: starting attempt loop 13040 1726882404.33304: running the handler 13040 1726882404.33313: _low_level_execute_command(): starting 13040 1726882404.33320: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13040 1726882404.33828: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882404.33848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882404.33868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.33882: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.33927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882404.33938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882404.34043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882404.35613: stdout chunk (state=3): >>>/root <<< 13040 1726882404.35708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882404.35761: stderr chunk (state=3): >>><<< 13040 1726882404.35769: stdout chunk (state=3): >>><<< 13040 1726882404.35787: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13040 1726882404.35800: _low_level_execute_command(): starting 13040 1726882404.35805: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882404.3578727-13151-119549827532451 `" && echo ansible-tmp-1726882404.3578727-13151-119549827532451="` echo /root/.ansible/tmp/ansible-tmp-1726882404.3578727-13151-119549827532451 `" ) && sleep 0' 13040 1726882404.36254: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882404.36268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882404.36304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.36317: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882404.36327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.36378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882404.36387: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13040 1726882404.36495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882404.38332: stdout chunk (state=3): >>>ansible-tmp-1726882404.3578727-13151-119549827532451=/root/.ansible/tmp/ansible-tmp-1726882404.3578727-13151-119549827532451 <<< 13040 1726882404.38434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882404.38494: stderr chunk (state=3): >>><<< 13040 1726882404.38497: stdout chunk (state=3): >>><<< 13040 1726882404.38515: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882404.3578727-13151-119549827532451=/root/.ansible/tmp/ansible-tmp-1726882404.3578727-13151-119549827532451 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13040 1726882404.38567: variable 'ansible_module_compression' from source: unknown 13040 1726882404.38613: 
ANSIBALLZ: Using lock for stat 13040 1726882404.38616: ANSIBALLZ: Acquiring lock 13040 1726882404.38619: ANSIBALLZ: Lock acquired: 139648575893872 13040 1726882404.38621: ANSIBALLZ: Creating module 13040 1726882404.46948: ANSIBALLZ: Writing module into payload 13040 1726882404.47026: ANSIBALLZ: Writing module 13040 1726882404.47044: ANSIBALLZ: Renaming module 13040 1726882404.47049: ANSIBALLZ: Done creating module 13040 1726882404.47065: variable 'ansible_facts' from source: unknown 13040 1726882404.47120: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882404.3578727-13151-119549827532451/AnsiballZ_stat.py 13040 1726882404.47232: Sending initial data 13040 1726882404.47236: Sent initial data (153 bytes) 13040 1726882404.48069: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13040 1726882404.48072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882404.48075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882404.48077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882404.48080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882404.48082: stderr chunk (state=3): >>>debug2: match not found <<< 13040 1726882404.48084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.48114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13040 1726882404.48117: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 13040 1726882404.48120: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13040 1726882404.48122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882404.48124: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882404.48166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882404.48169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882404.48171: stderr chunk (state=3): >>>debug2: match found <<< 13040 1726882404.48173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.48219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882404.48239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882404.48248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882404.48382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882404.50129: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13040 1726882404.50218: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13040 1726882404.50312: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1304074tzu_9c/tmpbz4aml1g /root/.ansible/tmp/ansible-tmp-1726882404.3578727-13151-119549827532451/AnsiballZ_stat.py <<< 13040 1726882404.50399: stderr chunk (state=3): 
>>>debug1: Couldn't stat remote file: No such file or directory <<< 13040 1726882404.51574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882404.51724: stderr chunk (state=3): >>><<< 13040 1726882404.51727: stdout chunk (state=3): >>><<< 13040 1726882404.51747: done transferring module to remote 13040 1726882404.51767: _low_level_execute_command(): starting 13040 1726882404.51770: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882404.3578727-13151-119549827532451/ /root/.ansible/tmp/ansible-tmp-1726882404.3578727-13151-119549827532451/AnsiballZ_stat.py && sleep 0' 13040 1726882404.52421: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13040 1726882404.52430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882404.52440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882404.52454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882404.52498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882404.52504: stderr chunk (state=3): >>>debug2: match not found <<< 13040 1726882404.52514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.52527: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13040 1726882404.52534: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 13040 1726882404.52541: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13040 1726882404.52548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882404.52560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882404.52575: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882404.52584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882404.52591: stderr chunk (state=3): >>>debug2: match found <<< 13040 1726882404.52598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.52681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882404.52700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882404.52711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882404.52834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882404.54571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882404.54648: stderr chunk (state=3): >>><<< 13040 1726882404.54651: stdout chunk (state=3): >>><<< 13040 1726882404.54677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13040 1726882404.54680: _low_level_execute_command(): starting 13040 1726882404.54685: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882404.3578727-13151-119549827532451/AnsiballZ_stat.py && sleep 0' 13040 1726882404.55386: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13040 1726882404.55395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882404.55406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882404.55418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882404.55466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882404.55472: stderr chunk (state=3): >>>debug2: match not found <<< 13040 1726882404.55482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.55495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13040 1726882404.55502: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 13040 1726882404.55510: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13040 1726882404.55517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882404.55525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882404.55536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 
1726882404.55543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882404.55549: stderr chunk (state=3): >>>debug2: match found <<< 13040 1726882404.55562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.55636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882404.55659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882404.55673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882404.55810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882404.57760: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # <<< 13040 1726882404.57767: stdout chunk (state=3): >>>import '_weakref' # <<< 13040 1726882404.57810: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 13040 1726882404.57843: stdout chunk (state=3): >>>import 'posix' # <<< 13040 1726882404.57879: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 13040 1726882404.57882: stdout chunk (state=3): >>># installing zipimport hook <<< 13040 1726882404.57922: stdout chunk (state=3): >>>import 'time' # <<< 13040 1726882404.57925: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 13040 1726882404.57980: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.57997: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 13040 1726882404.58015: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 13040 1726882404.58043: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd843dc0> <<< 13040 1726882404.58086: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 13040 1726882404.58099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5d83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd843b20> <<< 13040 1726882404.58135: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 13040 1726882404.58148: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd843ac0> <<< 13040 1726882404.58171: stdout chunk (state=3): >>>import '_signal' # <<< 13040 1726882404.58197: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 13040 1726882404.58212: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5d8490> <<< 13040 1726882404.58246: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 13040 1726882404.58260: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object 
from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 13040 1726882404.58282: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5d8940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5d8670> <<< 13040 1726882404.58325: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 13040 1726882404.58328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 13040 1726882404.58354: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 13040 1726882404.58382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 13040 1726882404.58402: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 13040 1726882404.58436: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd58f190> <<< 13040 1726882404.58449: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 13040 1726882404.58471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 13040 1726882404.58554: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd58f220> <<< 13040 1726882404.58582: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 13040 1726882404.58613: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5b2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd58f940> <<< 13040 1726882404.58645: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5f0880> <<< 13040 1726882404.58666: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd588d90> <<< 13040 1726882404.58731: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 13040 1726882404.58733: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5b2d90> <<< 13040 1726882404.58780: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5d8970> <<< 13040 1726882404.58808: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 13040 1726882404.59008: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 13040 1726882404.59052: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 13040 1726882404.59094: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 13040 1726882404.59099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 13040 1726882404.59125: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 13040 1726882404.59128: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd553eb0> <<< 13040 1726882404.59177: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd556f40> <<< 13040 1726882404.59208: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 13040 1726882404.59211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 13040 1726882404.59238: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 13040 1726882404.59282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 13040 1726882404.59304: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # 
code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 13040 1726882404.59328: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd54c610> <<< 13040 1726882404.59332: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd552640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd553370> <<< 13040 1726882404.59348: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 13040 1726882404.59402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 13040 1726882404.59426: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 13040 1726882404.59450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.59479: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 13040 1726882404.59516: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd4d4e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4d4910> <<< 13040 1726882404.59527: stdout chunk (state=3): >>>import 'itertools' # <<< 13040 1726882404.59567: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches 
/usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4d4f10> <<< 13040 1726882404.59589: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 13040 1726882404.59625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4d4fd0> <<< 13040 1726882404.59648: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e70d0> <<< 13040 1726882404.59659: stdout chunk (state=3): >>>import '_collections' # <<< 13040 1726882404.59706: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd52ed90> <<< 13040 1726882404.59719: stdout chunk (state=3): >>>import '_functools' # <<< 13040 1726882404.59737: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd527670> <<< 13040 1726882404.59790: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 13040 1726882404.59815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd53a6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd55ae20> <<< 13040 1726882404.59845: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object 
from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 13040 1726882404.59857: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd4e7cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd52e2b0> <<< 13040 1726882404.59893: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.59910: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd53a2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5609d0> <<< 13040 1726882404.59948: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 13040 1726882404.59967: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.59993: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 13040 1726882404.60005: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e7eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fa9cd4e7df0> <<< 13040 1726882404.60044: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 13040 1726882404.60072: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e7d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 13040 1726882404.60090: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 13040 1726882404.60115: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 13040 1726882404.60168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 13040 1726882404.60199: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 13040 1726882404.60217: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4ba3d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 13040 1726882404.60228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 13040 1726882404.60257: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4ba4c0> <<< 13040 1726882404.60385: stdout 
chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4eef40> <<< 13040 1726882404.60418: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e9a90> <<< 13040 1726882404.60435: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e9490> <<< 13040 1726882404.60451: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 13040 1726882404.60471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 13040 1726882404.60499: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 13040 1726882404.60526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 13040 1726882404.60538: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3e3220> <<< 13040 1726882404.60572: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4a5520> <<< 13040 1726882404.60621: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e9f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd560040> <<< 13040 1726882404.60640: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 13040 1726882404.60675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 13040 
1726882404.60709: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3f5b50> import 'errno' # <<< 13040 1726882404.60737: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.60760: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3f5e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 13040 1726882404.60783: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 13040 1726882404.60804: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd406790> <<< 13040 1726882404.60825: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 13040 1726882404.60852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 13040 1726882404.60885: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd406cd0> <<< 13040 1726882404.60937: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from 
'/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd394400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3f5f70> <<< 13040 1726882404.60956: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 13040 1726882404.60999: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.61017: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3a52e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd406610> import 'pwd' # <<< 13040 1726882404.61046: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3a53a0> <<< 13040 1726882404.61082: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e7a30> <<< 13040 1726882404.61108: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 13040 1726882404.61119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 13040 1726882404.61149: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 13040 1726882404.61162: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 13040 1726882404.61201: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3c0700> <<< 13040 1726882404.61228: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 13040 1726882404.61257: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3c09d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3c07c0> <<< 13040 1726882404.61273: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3c08b0> <<< 13040 1726882404.61296: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 13040 1726882404.61495: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from 
'/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3c0d00> <<< 13040 1726882404.61535: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3cb250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3c0940> <<< 13040 1726882404.61549: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3b4a90> <<< 13040 1726882404.61583: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e7610> <<< 13040 1726882404.61597: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 13040 1726882404.61650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 13040 1726882404.61685: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3c0af0> <<< 13040 1726882404.61784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 13040 1726882404.61798: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa9cd2e46d0> <<< 13040 1726882404.61931: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip' # zipimport: zlib available <<< 13040 1726882404.62023: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.62049: stdout chunk (state=3): >>>import ansible # loaded from Zip 
/tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/__init__.py <<< 13040 1726882404.62074: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13040 1726882404.62092: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 13040 1726882404.62101: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.63322: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.64294: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd1e1820> <<< 13040 1726882404.64315: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.64335: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 13040 1726882404.64357: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd270730> <<< 13040 1726882404.64395: stdout chunk (state=3): >>>import 
'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd270610> <<< 13040 1726882404.64422: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd270340> <<< 13040 1726882404.64446: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 13040 1726882404.64503: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd270460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd270160> import 'atexit' # <<< 13040 1726882404.64544: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd2703a0> <<< 13040 1726882404.64547: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 13040 1726882404.64576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 13040 1726882404.64620: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd270790> <<< 13040 1726882404.64637: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 13040 1726882404.64671: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 13040 1726882404.64696: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 13040 1726882404.64708: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 13040 1726882404.64782: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd1617f0> <<< 13040 1726882404.64818: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd161b80> <<< 13040 1726882404.64850: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd1619d0> <<< 13040 1726882404.64862: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 13040 1726882404.64887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 13040 1726882404.64939: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd180af0> <<< 13040 1726882404.64942: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd26ad60> <<< 13040 1726882404.65117: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd2704f0> <<< 13040 1726882404.65151: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 13040 1726882404.65159: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd26a1c0> <<< 13040 1726882404.65173: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 13040 1726882404.65204: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 13040 1726882404.65221: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 13040 1726882404.65257: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd1ddb20> <<< 13040 1726882404.65346: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd213eb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd2138b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd17a2e0> <<< 13040 1726882404.65379: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd2139a0> <<< 13040 1726882404.65403: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd241d00> <<< 13040 1726882404.65432: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 13040 1726882404.65435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 13040 1726882404.65456: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 13040 1726882404.65484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 13040 1726882404.65562: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd142a00> <<< 13040 1726882404.65619: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd249e80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 13040 1726882404.65642: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' 
executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd1510a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd249eb0> <<< 13040 1726882404.65664: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 13040 1726882404.65698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.65729: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 13040 1726882404.65785: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd216730> <<< 13040 1726882404.65918: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd1510d0> <<< 13040 1726882404.66009: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd14e550> <<< 13040 1726882404.66040: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 13040 1726882404.66043: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7fa9cd14e610> <<< 13040 1726882404.66094: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd14dc40> <<< 13040 1726882404.66098: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd241ee0> <<< 13040 1726882404.66122: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 13040 1726882404.66133: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 13040 1726882404.66187: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd1d2b50> <<< 13040 1726882404.66380: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd1d0940> <<< 13040 1726882404.66407: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd144820> 
<<< 13040 1726882404.66432: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd1d25b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd20aaf0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 13040 1726882404.66451: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.66532: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.66624: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.66651: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 13040 1726882404.66668: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 13040 1726882404.66676: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.66769: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.66857: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.67299: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.67769: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip 
/tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 13040 1726882404.67792: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 13040 1726882404.67795: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.67842: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cc7f9df0> <<< 13040 1726882404.67921: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 13040 1726882404.67934: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9ccbe95b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9ccbd4df0> <<< 13040 1726882404.67987: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 13040 1726882404.68008: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.68020: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip 
/tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 13040 1726882404.68145: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.68280: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 13040 1726882404.68304: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd1c79d0> # zipimport: zlib available <<< 13040 1726882404.68686: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.69056: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.69107: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.69178: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 13040 1726882404.69210: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.69251: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 13040 1726882404.69254: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.69302: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.69393: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 13040 1726882404.69417: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.69420: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip 
/tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 13040 1726882404.69448: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.69492: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 13040 1726882404.69497: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.69679: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.69866: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 13040 1726882404.69903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 13040 1726882404.69906: stdout chunk (state=3): >>>import '_ast' # <<< 13040 1726882404.69974: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cc7cae50> # zipimport: zlib available <<< 13040 1726882404.70033: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.70105: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 13040 1726882404.70132: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 13040 
1726882404.70135: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.70172: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.70211: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 13040 1726882404.70215: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.70247: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.70286: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.70376: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.70434: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 13040 1726882404.70455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 13040 1726882404.70527: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd25b910> <<< 13040 1726882404.70560: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cc7cabe0> <<< 13040 1726882404.70583: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/file.py <<< 13040 1726882404.70607: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip 
/tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 13040 1726882404.70720: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.70771: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.70795: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.70836: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 13040 1726882404.70854: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 13040 1726882404.70865: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 13040 1726882404.70892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 13040 1726882404.70919: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 13040 1726882404.70935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 13040 1726882404.71015: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cc78cc70> <<< 13040 1726882404.71056: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9ccbdc670> <<< 13040 1726882404.71116: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9ccbdb850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 13040 1726882404.71159: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13040 1726882404.71174: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 13040 1726882404.71237: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 13040 1726882404.71271: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13040 1726882404.71283: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 13040 1726882404.71393: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.71561: stdout chunk (state=3): >>># zipimport: zlib available <<< 13040 1726882404.71734: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 13040 1726882404.71950: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 13040 1726882404.71978: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # 
cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 <<< 13040 1726882404.71999: stdout chunk (state=3): >>># cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii <<< 13040 1726882404.72035: stdout chunk (state=3): >>># cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # 
cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 <<< 13040 1726882404.72078: stdout chunk (state=3): >>># cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # 
cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string <<< 13040 1726882404.72099: stdout chunk (state=3): >>># destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 13040 1726882404.72267: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 13040 1726882404.72299: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression <<< 13040 1726882404.72338: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # 
destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 13040 1726882404.72361: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy array # destroy datetime <<< 13040 1726882404.72389: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 13040 1726882404.72442: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon <<< 13040 1726882404.72514: stdout chunk (state=3): >>># cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 13040 1726882404.72577: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale <<< 13040 1726882404.72628: stdout chunk (state=3): >>># destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 13040 1726882404.72645: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 13040 1726882404.72796: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 13040 1726882404.72844: stdout chunk (state=3): >>># destroy _heapq # destroy 
posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 13040 1726882404.72862: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 13040 1726882404.72886: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 13040 1726882404.73246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 13040 1726882404.73255: stdout chunk (state=3): >>><<< 13040 1726882404.73269: stderr chunk (state=3): >>><<< 13040 1726882404.73352: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd843dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5d83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd843b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd843ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5d8490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5d8940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5d8670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd58f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd58f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5b2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd58f940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5f0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd588d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5b2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5d8970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd553eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd556f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd54c610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd552640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd553370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd4d4e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4d4910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4d4f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4d4fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e70d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd52ed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd527670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd53a6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd55ae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd4e7cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd52e2b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd53a2e0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd5609d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e7eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e7df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e7d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4ba3d0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4ba4c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4eef40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e9a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e9490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3e3220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4a5520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e9f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd560040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3f5b50> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3f5e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd406790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd406cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd394400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3f5f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3a52e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd406610> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3a53a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e7a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3c0700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3c09d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3c07c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3c08b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3c0d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd3cb250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3c0940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3b4a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd4e7610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd3c0af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa9cd2e46d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd1e1820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd270730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd270610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd270340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd270460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd270160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd2703a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd270790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd1617f0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd161b80> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd1619d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd180af0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa9cd26ad60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd2704f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd26a1c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd1ddb20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd213eb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd2138b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd17a2e0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd2139a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd241d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd142a00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd249e80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd1510a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd249eb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd216730> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd1510d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd14e550> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd14e610> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd14dc40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd241ee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7fa9cd1d2b50> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd1d0940> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd144820> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd1d25b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd20aaf0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # 
loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cc7f9df0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9ccbe95b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9ccbd4df0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cd1c79d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib 
available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cc7cae50> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from 
Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9cd25b910> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cc7cabe0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9cc78cc70> 
import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9ccbdc670> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9ccbdb850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_6sleg73h/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # 
cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # 
cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # 
destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] 
removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] 
wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping 
posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. [WARNING]: Module invocation had junk after the JSON data: [interpreter shutdown trace, byte-for-byte identical to the one reproduced above, elided] 13040 1726882404.73988: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh',
'_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882404.3578727-13151-119549827532451/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13040 1726882404.73992: _low_level_execute_command(): starting 13040 1726882404.73994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882404.3578727-13151-119549827532451/ > /dev/null 2>&1 && sleep 0' 13040 1726882404.74607: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13040 1726882404.74623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882404.74645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882404.74669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882404.74712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882404.74723: stderr chunk (state=3): >>>debug2: match not found <<< 13040 1726882404.74737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.74759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13040 1726882404.74773: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 13040 1726882404.74784: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13040 1726882404.74795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13040 1726882404.74807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13040 1726882404.74821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13040 1726882404.74832: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 13040 1726882404.74842: stderr chunk (state=3): >>>debug2: match found <<< 13040 1726882404.74854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13040 1726882404.74940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13040 1726882404.74963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13040 1726882404.74986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13040 1726882404.75113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13040 1726882404.76928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13040 1726882404.77010: stderr chunk (state=3): >>><<< 13040 1726882404.77013: stdout chunk (state=3): >>><<< 13040 1726882404.77472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13040 1726882404.77476: handler run complete 13040 1726882404.77479: attempt loop complete, returning result 13040 1726882404.77481: _execute() done 13040 1726882404.77483: dumping result to json 13040 1726882404.77485: done dumping result, returning 13040 1726882404.77487: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0e448fcc-3ce9-b123-314b-0000000001cf] 13040 1726882404.77489: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001cf 13040 1726882404.77560: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001cf 13040 1726882404.77565: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 13040 1726882404.77625: no more pending results, returning what we have 13040 1726882404.77628: results queue empty 13040 1726882404.77629: checking for any_errors_fatal 13040 1726882404.77638: done checking for any_errors_fatal 13040 1726882404.77639: checking for max_fail_percentage 13040 1726882404.77641: done checking for max_fail_percentage 13040 1726882404.77641: checking to see if all hosts have failed and the running result is not ok 13040 1726882404.77642: done checking to see if all hosts have failed 13040 1726882404.77643: getting the remaining hosts for this loop 13040 1726882404.77644: done getting the remaining hosts for this loop 13040 1726882404.77648: getting the next task for host managed_node1 13040 1726882404.77656: done getting next task for host managed_node1 13040 1726882404.77659: ^ task is: TASK: Set flag to indicate system is ostree 13040 1726882404.77661: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882404.77667: getting variables 13040 1726882404.77668: in VariableManager get_vars() 13040 1726882404.77695: Calling all_inventory to load vars for managed_node1 13040 1726882404.77698: Calling groups_inventory to load vars for managed_node1 13040 1726882404.77702: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.77712: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.77714: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.77717: Calling groups_plugins_play to load vars for managed_node1 13040 1726882404.77906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.78126: done with get_vars() 13040 1726882404.78137: done getting variables 13040 1726882404.78296: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:33:24 -0400 (0:00:00.467) 0:00:02.260 ****** 13040 1726882404.78328: entering _queue_task() for managed_node1/set_fact 13040 1726882404.78330: Creating lock for set_fact 13040 1726882404.78610: worker is 1 (out of 1 available) 13040 
1726882404.78621: exiting _queue_task() for managed_node1/set_fact 13040 1726882404.78634: done queuing things up, now waiting for results queue to drain 13040 1726882404.78636: waiting for pending results... 13040 1726882404.78897: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 13040 1726882404.79014: in run() - task 0e448fcc-3ce9-b123-314b-0000000001d0 13040 1726882404.79036: variable 'ansible_search_path' from source: unknown 13040 1726882404.79042: variable 'ansible_search_path' from source: unknown 13040 1726882404.79090: calling self._execute() 13040 1726882404.79157: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.79161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.79171: variable 'omit' from source: magic vars 13040 1726882404.79449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13040 1726882404.79717: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13040 1726882404.79754: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13040 1726882404.79782: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13040 1726882404.79808: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13040 1726882404.79873: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13040 1726882404.79891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13040 1726882404.79911: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882404.79929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13040 1726882404.80016: Evaluated conditional (not __network_is_ostree is defined): True 13040 1726882404.80021: variable 'omit' from source: magic vars 13040 1726882404.80046: variable 'omit' from source: magic vars 13040 1726882404.80127: variable '__ostree_booted_stat' from source: set_fact 13040 1726882404.80164: variable 'omit' from source: magic vars 13040 1726882404.80185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13040 1726882404.80205: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13040 1726882404.80220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13040 1726882404.80233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13040 1726882404.80241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13040 1726882404.80269: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13040 1726882404.80272: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.80275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.80336: Set connection var ansible_shell_executable to /bin/sh 13040 1726882404.80339: Set connection var ansible_timeout to 10 13040 1726882404.80347: Set connection var ansible_pipelining to False 13040 1726882404.80353: Set connection var ansible_shell_type 
to sh 13040 1726882404.80358: Set connection var ansible_connection to ssh 13040 1726882404.80365: Set connection var ansible_module_compression to ZIP_DEFLATED 13040 1726882404.80381: variable 'ansible_shell_executable' from source: unknown 13040 1726882404.80384: variable 'ansible_connection' from source: unknown 13040 1726882404.80387: variable 'ansible_module_compression' from source: unknown 13040 1726882404.80389: variable 'ansible_shell_type' from source: unknown 13040 1726882404.80391: variable 'ansible_shell_executable' from source: unknown 13040 1726882404.80394: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.80398: variable 'ansible_pipelining' from source: unknown 13040 1726882404.80400: variable 'ansible_timeout' from source: unknown 13040 1726882404.80404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.80481: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13040 1726882404.80491: variable 'omit' from source: magic vars 13040 1726882404.80497: starting attempt loop 13040 1726882404.80499: running the handler 13040 1726882404.80511: handler run complete 13040 1726882404.80520: attempt loop complete, returning result 13040 1726882404.80522: _execute() done 13040 1726882404.80525: dumping result to json 13040 1726882404.80527: done dumping result, returning 13040 1726882404.80534: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-b123-314b-0000000001d0] 13040 1726882404.80540: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001d0 ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 13040 1726882404.80681: no 
more pending results, returning what we have 13040 1726882404.80684: results queue empty 13040 1726882404.80685: checking for any_errors_fatal 13040 1726882404.80690: done checking for any_errors_fatal 13040 1726882404.80691: checking for max_fail_percentage 13040 1726882404.80693: done checking for max_fail_percentage 13040 1726882404.80693: checking to see if all hosts have failed and the running result is not ok 13040 1726882404.80694: done checking to see if all hosts have failed 13040 1726882404.80695: getting the remaining hosts for this loop 13040 1726882404.80696: done getting the remaining hosts for this loop 13040 1726882404.80700: getting the next task for host managed_node1 13040 1726882404.80709: done getting next task for host managed_node1 13040 1726882404.80711: ^ task is: TASK: Fix CentOS6 Base repo 13040 1726882404.80713: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882404.80717: getting variables 13040 1726882404.80718: in VariableManager get_vars() 13040 1726882404.80743: Calling all_inventory to load vars for managed_node1 13040 1726882404.80746: Calling groups_inventory to load vars for managed_node1 13040 1726882404.80748: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.80760: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.80762: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.80770: Calling groups_plugins_play to load vars for managed_node1 13040 1726882404.80931: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001d0 13040 1726882404.80949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.81169: done with get_vars() 13040 1726882404.81176: done getting variables 13040 1726882404.81201: WORKER PROCESS EXITING 13040 1726882404.81271: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:33:24 -0400 (0:00:00.029) 0:00:02.290 ****** 13040 1726882404.81290: entering _queue_task() for managed_node1/copy 13040 1726882404.81473: worker is 1 (out of 1 available) 13040 1726882404.81485: exiting _queue_task() for managed_node1/copy 13040 1726882404.81497: done queuing things up, now waiting for results queue to drain 13040 1726882404.81498: waiting for pending results... 
13040 1726882404.81639: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 13040 1726882404.81702: in run() - task 0e448fcc-3ce9-b123-314b-0000000001d2 13040 1726882404.81724: variable 'ansible_search_path' from source: unknown 13040 1726882404.81733: variable 'ansible_search_path' from source: unknown 13040 1726882404.81779: calling self._execute() 13040 1726882404.81855: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.81873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.81886: variable 'omit' from source: magic vars 13040 1726882404.82230: variable 'ansible_distribution' from source: facts 13040 1726882404.82255: Evaluated conditional (ansible_distribution == 'CentOS'): True 13040 1726882404.82368: variable 'ansible_distribution_major_version' from source: facts 13040 1726882404.82378: Evaluated conditional (ansible_distribution_major_version == '6'): False 13040 1726882404.82384: when evaluation is False, skipping this task 13040 1726882404.82389: _execute() done 13040 1726882404.82394: dumping result to json 13040 1726882404.82401: done dumping result, returning 13040 1726882404.82412: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-b123-314b-0000000001d2] 13040 1726882404.82420: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001d2 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 13040 1726882404.82582: no more pending results, returning what we have 13040 1726882404.82585: results queue empty 13040 1726882404.82586: checking for any_errors_fatal 13040 1726882404.82592: done checking for any_errors_fatal 13040 1726882404.82592: checking for max_fail_percentage 13040 1726882404.82594: done checking for max_fail_percentage 13040 1726882404.82594: checking to see if all hosts have failed and the 
running result is not ok 13040 1726882404.82595: done checking to see if all hosts have failed 13040 1726882404.82596: getting the remaining hosts for this loop 13040 1726882404.82597: done getting the remaining hosts for this loop 13040 1726882404.82600: getting the next task for host managed_node1 13040 1726882404.82606: done getting next task for host managed_node1 13040 1726882404.82609: ^ task is: TASK: Include the task 'enable_epel.yml' 13040 1726882404.82611: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882404.82615: getting variables 13040 1726882404.82616: in VariableManager get_vars() 13040 1726882404.82644: Calling all_inventory to load vars for managed_node1 13040 1726882404.82646: Calling groups_inventory to load vars for managed_node1 13040 1726882404.82650: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.82668: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.82671: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.82674: Calling groups_plugins_play to load vars for managed_node1 13040 1726882404.82796: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001d2 13040 1726882404.82799: WORKER PROCESS EXITING 13040 1726882404.82808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.82926: done with get_vars() 13040 1726882404.82933: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:33:24 -0400 (0:00:00.017) 0:00:02.307 ****** 13040 1726882404.82998: entering _queue_task() for managed_node1/include_tasks 13040 1726882404.83177: worker is 1 (out of 1 available) 13040 1726882404.83190: exiting _queue_task() for managed_node1/include_tasks 13040 1726882404.83200: done queuing things up, now waiting for results queue to drain 13040 1726882404.83202: waiting for pending results... 
13040 1726882404.83339: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 13040 1726882404.83406: in run() - task 0e448fcc-3ce9-b123-314b-0000000001d3 13040 1726882404.83415: variable 'ansible_search_path' from source: unknown 13040 1726882404.83418: variable 'ansible_search_path' from source: unknown 13040 1726882404.83445: calling self._execute() 13040 1726882404.83502: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.83506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.83513: variable 'omit' from source: magic vars 13040 1726882404.83830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882404.85346: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882404.85390: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882404.85417: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882404.85442: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882404.85465: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882404.85529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882404.85549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882404.85571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882404.85598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882404.85609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882404.85692: variable '__network_is_ostree' from source: set_fact 13040 1726882404.85706: Evaluated conditional (not __network_is_ostree | d(false)): True 13040 1726882404.85712: _execute() done 13040 1726882404.85715: dumping result to json 13040 1726882404.85717: done dumping result, returning 13040 1726882404.85723: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-b123-314b-0000000001d3] 13040 1726882404.85728: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001d3 13040 1726882404.85812: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001d3 13040 1726882404.85815: WORKER PROCESS EXITING 13040 1726882404.85840: no more pending results, returning what we have 13040 1726882404.85845: in VariableManager get_vars() 13040 1726882404.85882: Calling all_inventory to load vars for managed_node1 13040 1726882404.85885: Calling groups_inventory to load vars for managed_node1 13040 1726882404.85889: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.85900: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.85903: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.85906: Calling groups_plugins_play to load vars for managed_node1 13040 1726882404.86074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 13040 1726882404.86279: done with get_vars() 13040 1726882404.86286: variable 'ansible_search_path' from source: unknown 13040 1726882404.86288: variable 'ansible_search_path' from source: unknown 13040 1726882404.86339: we have included files to process 13040 1726882404.86341: generating all_blocks data 13040 1726882404.86342: done generating all_blocks data 13040 1726882404.86346: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13040 1726882404.86347: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13040 1726882404.86349: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13040 1726882404.87058: done processing included file 13040 1726882404.87061: iterating over new_blocks loaded from include file 13040 1726882404.87062: in VariableManager get_vars() 13040 1726882404.87091: done with get_vars() 13040 1726882404.87093: filtering new block on tags 13040 1726882404.87150: done filtering new block on tags 13040 1726882404.87155: in VariableManager get_vars() 13040 1726882404.87174: done with get_vars() 13040 1726882404.87175: filtering new block on tags 13040 1726882404.87202: done filtering new block on tags 13040 1726882404.87204: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 13040 1726882404.87215: extending task lists for all hosts with included blocks 13040 1726882404.87344: done extending task lists 13040 1726882404.87345: done processing included files 13040 1726882404.87345: results queue empty 13040 1726882404.87346: checking for any_errors_fatal 13040 1726882404.87348: done checking for any_errors_fatal 13040 1726882404.87349: checking for max_fail_percentage 13040 1726882404.87349: done 
checking for max_fail_percentage 13040 1726882404.87350: checking to see if all hosts have failed and the running result is not ok 13040 1726882404.87350: done checking to see if all hosts have failed 13040 1726882404.87351: getting the remaining hosts for this loop 13040 1726882404.87353: done getting the remaining hosts for this loop 13040 1726882404.87354: getting the next task for host managed_node1 13040 1726882404.87357: done getting next task for host managed_node1 13040 1726882404.87359: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 13040 1726882404.87361: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882404.87362: getting variables 13040 1726882404.87363: in VariableManager get_vars() 13040 1726882404.87371: Calling all_inventory to load vars for managed_node1 13040 1726882404.87372: Calling groups_inventory to load vars for managed_node1 13040 1726882404.87374: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.87378: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.87383: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.87384: Calling groups_plugins_play to load vars for managed_node1 13040 1726882404.87503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.87611: done with get_vars() 13040 1726882404.87621: done getting variables 13040 1726882404.87671: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 13040 1726882404.87754: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:33:24 -0400 (0:00:00.047) 0:00:02.355 ****** 13040 1726882404.87791: entering _queue_task() for managed_node1/command 13040 1726882404.87792: Creating lock for command 13040 1726882404.87998: worker is 1 (out of 1 available) 13040 1726882404.88010: exiting _queue_task() for managed_node1/command 13040 1726882404.88022: done queuing things up, now waiting for results queue to drain 13040 1726882404.88023: waiting for pending results... 
13040 1726882404.88184: running TaskExecutor() for managed_node1/TASK: Create EPEL 9 13040 1726882404.88471: in run() - task 0e448fcc-3ce9-b123-314b-0000000001ed 13040 1726882404.88476: variable 'ansible_search_path' from source: unknown 13040 1726882404.88478: variable 'ansible_search_path' from source: unknown 13040 1726882404.88481: calling self._execute() 13040 1726882404.88484: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.88486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.88488: variable 'omit' from source: magic vars 13040 1726882404.88728: variable 'ansible_distribution' from source: facts 13040 1726882404.88737: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13040 1726882404.88854: variable 'ansible_distribution_major_version' from source: facts 13040 1726882404.88862: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13040 1726882404.88867: when evaluation is False, skipping this task 13040 1726882404.88871: _execute() done 13040 1726882404.88873: dumping result to json 13040 1726882404.88876: done dumping result, returning 13040 1726882404.88881: done running TaskExecutor() for managed_node1/TASK: Create EPEL 9 [0e448fcc-3ce9-b123-314b-0000000001ed] 13040 1726882404.88887: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001ed 13040 1726882404.88982: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001ed 13040 1726882404.88985: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13040 1726882404.89060: no more pending results, returning what we have 13040 1726882404.89065: results queue empty 13040 1726882404.89066: checking for any_errors_fatal 13040 1726882404.89068: done checking for any_errors_fatal 13040 1726882404.89068: checking for 
max_fail_percentage 13040 1726882404.89070: done checking for max_fail_percentage 13040 1726882404.89070: checking to see if all hosts have failed and the running result is not ok 13040 1726882404.89071: done checking to see if all hosts have failed 13040 1726882404.89072: getting the remaining hosts for this loop 13040 1726882404.89073: done getting the remaining hosts for this loop 13040 1726882404.89076: getting the next task for host managed_node1 13040 1726882404.89082: done getting next task for host managed_node1 13040 1726882404.89084: ^ task is: TASK: Install yum-utils package 13040 1726882404.89087: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882404.89090: getting variables 13040 1726882404.89091: in VariableManager get_vars() 13040 1726882404.89114: Calling all_inventory to load vars for managed_node1 13040 1726882404.89137: Calling groups_inventory to load vars for managed_node1 13040 1726882404.89141: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.89150: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.89154: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.89157: Calling groups_plugins_play to load vars for managed_node1 13040 1726882404.89320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.89581: done with get_vars() 13040 1726882404.89590: done getting variables 13040 1726882404.89717: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:33:24 -0400 (0:00:00.019) 0:00:02.374 ****** 13040 1726882404.89743: entering _queue_task() for managed_node1/package 13040 1726882404.89745: Creating lock for package 13040 1726882404.90212: worker is 1 (out of 1 available) 13040 1726882404.90224: exiting _queue_task() for managed_node1/package 13040 1726882404.90235: done queuing things up, now waiting for results queue to drain 13040 1726882404.90236: waiting for pending results... 
13040 1726882404.90489: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 13040 1726882404.90591: in run() - task 0e448fcc-3ce9-b123-314b-0000000001ee 13040 1726882404.90610: variable 'ansible_search_path' from source: unknown 13040 1726882404.90613: variable 'ansible_search_path' from source: unknown 13040 1726882404.90644: calling self._execute() 13040 1726882404.90750: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.90753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.90767: variable 'omit' from source: magic vars 13040 1726882404.91185: variable 'ansible_distribution' from source: facts 13040 1726882404.91196: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13040 1726882404.91318: variable 'ansible_distribution_major_version' from source: facts 13040 1726882404.91323: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13040 1726882404.91331: when evaluation is False, skipping this task 13040 1726882404.91335: _execute() done 13040 1726882404.91337: dumping result to json 13040 1726882404.91340: done dumping result, returning 13040 1726882404.91348: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0e448fcc-3ce9-b123-314b-0000000001ee] 13040 1726882404.91357: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001ee
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
13040 1726882404.91515: no more pending results, returning what we have 13040 1726882404.91519: results queue empty 13040 1726882404.91519: checking for any_errors_fatal 13040 1726882404.91528: done checking for any_errors_fatal 13040 1726882404.91529: checking for max_fail_percentage 13040 1726882404.91530: done checking for max_fail_percentage 13040 1726882404.91531: checking to see if 
all hosts have failed and the running result is not ok 13040 1726882404.91532: done checking to see if all hosts have failed 13040 1726882404.91533: getting the remaining hosts for this loop 13040 1726882404.91534: done getting the remaining hosts for this loop 13040 1726882404.91537: getting the next task for host managed_node1 13040 1726882404.91545: done getting next task for host managed_node1 13040 1726882404.91547: ^ task is: TASK: Enable EPEL 7 13040 1726882404.91554: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882404.91557: getting variables 13040 1726882404.91558: in VariableManager get_vars() 13040 1726882404.91588: Calling all_inventory to load vars for managed_node1 13040 1726882404.91590: Calling groups_inventory to load vars for managed_node1 13040 1726882404.91594: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.91609: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.91612: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.91616: Calling groups_plugins_play to load vars for managed_node1 13040 1726882404.91785: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001ee 13040 1726882404.91791: WORKER PROCESS EXITING 13040 1726882404.91805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.92000: done with get_vars() 13040 1726882404.92010: done getting variables 13040 1726882404.92085: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:33:24 -0400 (0:00:00.023) 0:00:02.398 ****** 13040 1726882404.92125: entering _queue_task() for managed_node1/command 13040 1726882404.92567: worker is 1 (out of 1 available) 13040 1726882404.92579: exiting _queue_task() for managed_node1/command 13040 1726882404.92592: done queuing things up, now waiting for results queue to drain 13040 1726882404.92593: waiting for pending results... 
13040 1726882404.92962: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 13040 1726882404.93074: in run() - task 0e448fcc-3ce9-b123-314b-0000000001ef 13040 1726882404.93083: variable 'ansible_search_path' from source: unknown 13040 1726882404.93086: variable 'ansible_search_path' from source: unknown 13040 1726882404.93128: calling self._execute() 13040 1726882404.93197: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.93201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.93209: variable 'omit' from source: magic vars 13040 1726882404.93491: variable 'ansible_distribution' from source: facts 13040 1726882404.93501: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13040 1726882404.93590: variable 'ansible_distribution_major_version' from source: facts 13040 1726882404.93595: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13040 1726882404.93598: when evaluation is False, skipping this task 13040 1726882404.93601: _execute() done 13040 1726882404.93604: dumping result to json 13040 1726882404.93607: done dumping result, returning 13040 1726882404.93613: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0e448fcc-3ce9-b123-314b-0000000001ef] 13040 1726882404.93619: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001ef 13040 1726882404.93707: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001ef 13040 1726882404.93710: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
13040 1726882404.93757: no more pending results, returning what we have 13040 1726882404.93760: results queue empty 13040 1726882404.93761: checking for any_errors_fatal 13040 1726882404.93769: done checking for any_errors_fatal 13040 1726882404.93770: checking for 
max_fail_percentage 13040 1726882404.93772: done checking for max_fail_percentage 13040 1726882404.93773: checking to see if all hosts have failed and the running result is not ok 13040 1726882404.93773: done checking to see if all hosts have failed 13040 1726882404.93774: getting the remaining hosts for this loop 13040 1726882404.93775: done getting the remaining hosts for this loop 13040 1726882404.93779: getting the next task for host managed_node1 13040 1726882404.93785: done getting next task for host managed_node1 13040 1726882404.93788: ^ task is: TASK: Enable EPEL 8 13040 1726882404.93791: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882404.93795: getting variables 13040 1726882404.93796: in VariableManager get_vars() 13040 1726882404.93824: Calling all_inventory to load vars for managed_node1 13040 1726882404.93826: Calling groups_inventory to load vars for managed_node1 13040 1726882404.93829: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.93839: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.93841: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.93844: Calling groups_plugins_play to load vars for managed_node1 13040 1726882404.93992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.94106: done with get_vars() 13040 1726882404.94113: done getting variables 13040 1726882404.94153: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:33:24 -0400 (0:00:00.020) 0:00:02.418 ****** 13040 1726882404.94175: entering _queue_task() for managed_node1/command 13040 1726882404.94357: worker is 1 (out of 1 available) 13040 1726882404.94372: exiting _queue_task() for managed_node1/command 13040 1726882404.94384: done queuing things up, now waiting for results queue to drain 13040 1726882404.94385: waiting for pending results... 
13040 1726882404.94533: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 13040 1726882404.94601: in run() - task 0e448fcc-3ce9-b123-314b-0000000001f0 13040 1726882404.94612: variable 'ansible_search_path' from source: unknown 13040 1726882404.94616: variable 'ansible_search_path' from source: unknown 13040 1726882404.94642: calling self._execute() 13040 1726882404.94732: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.94735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.94737: variable 'omit' from source: magic vars 13040 1726882404.95129: variable 'ansible_distribution' from source: facts 13040 1726882404.95140: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13040 1726882404.95275: variable 'ansible_distribution_major_version' from source: facts 13040 1726882404.95282: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13040 1726882404.95285: when evaluation is False, skipping this task 13040 1726882404.95288: _execute() done 13040 1726882404.95291: dumping result to json 13040 1726882404.95293: done dumping result, returning 13040 1726882404.95299: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0e448fcc-3ce9-b123-314b-0000000001f0] 13040 1726882404.95305: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001f0 13040 1726882404.95392: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001f0 13040 1726882404.95395: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
13040 1726882404.95439: no more pending results, returning what we have 13040 1726882404.95442: results queue empty 13040 1726882404.95443: checking for any_errors_fatal 13040 1726882404.95446: done checking for any_errors_fatal 13040 1726882404.95447: checking for 
max_fail_percentage 13040 1726882404.95449: done checking for max_fail_percentage 13040 1726882404.95449: checking to see if all hosts have failed and the running result is not ok 13040 1726882404.95450: done checking to see if all hosts have failed 13040 1726882404.95451: getting the remaining hosts for this loop 13040 1726882404.95453: done getting the remaining hosts for this loop 13040 1726882404.95456: getting the next task for host managed_node1 13040 1726882404.95466: done getting next task for host managed_node1 13040 1726882404.95469: ^ task is: TASK: Enable EPEL 6 13040 1726882404.95472: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882404.95475: getting variables 13040 1726882404.95476: in VariableManager get_vars() 13040 1726882404.95500: Calling all_inventory to load vars for managed_node1 13040 1726882404.95502: Calling groups_inventory to load vars for managed_node1 13040 1726882404.95506: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.95515: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.95518: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.95521: Calling groups_plugins_play to load vars for managed_node1 13040 1726882404.95690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.95913: done with get_vars() 13040 1726882404.95973: done getting variables 13040 1726882404.96038: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:33:24 -0400 (0:00:00.018) 0:00:02.437 ****** 13040 1726882404.96078: entering _queue_task() for managed_node1/copy 13040 1726882404.96283: worker is 1 (out of 1 available) 13040 1726882404.96303: exiting _queue_task() for managed_node1/copy 13040 1726882404.96315: done queuing things up, now waiting for results queue to drain 13040 1726882404.96317: waiting for pending results... 
13040 1726882404.96458: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 13040 1726882404.96526: in run() - task 0e448fcc-3ce9-b123-314b-0000000001f2 13040 1726882404.96534: variable 'ansible_search_path' from source: unknown 13040 1726882404.96539: variable 'ansible_search_path' from source: unknown 13040 1726882404.96568: calling self._execute() 13040 1726882404.96622: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.96625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.96634: variable 'omit' from source: magic vars 13040 1726882404.96900: variable 'ansible_distribution' from source: facts 13040 1726882404.96910: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13040 1726882404.96988: variable 'ansible_distribution_major_version' from source: facts 13040 1726882404.96992: Evaluated conditional (ansible_distribution_major_version == '6'): False 13040 1726882404.96995: when evaluation is False, skipping this task 13040 1726882404.97000: _execute() done 13040 1726882404.97002: dumping result to json 13040 1726882404.97005: done dumping result, returning 13040 1726882404.97011: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0e448fcc-3ce9-b123-314b-0000000001f2] 13040 1726882404.97016: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001f2 13040 1726882404.97101: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001f2 13040 1726882404.97105: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
13040 1726882404.97149: no more pending results, returning what we have 13040 1726882404.97153: results queue empty 13040 1726882404.97153: checking for any_errors_fatal 13040 1726882404.97157: done checking for any_errors_fatal 13040 1726882404.97158: checking for max_fail_percentage 
13040 1726882404.97160: done checking for max_fail_percentage 13040 1726882404.97160: checking to see if all hosts have failed and the running result is not ok 13040 1726882404.97161: done checking to see if all hosts have failed 13040 1726882404.97162: getting the remaining hosts for this loop 13040 1726882404.97164: done getting the remaining hosts for this loop 13040 1726882404.97168: getting the next task for host managed_node1 13040 1726882404.97175: done getting next task for host managed_node1 13040 1726882404.97178: ^ task is: TASK: Set network provider to 'initscripts' 13040 1726882404.97180: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882404.97184: getting variables 13040 1726882404.97185: in VariableManager get_vars() 13040 1726882404.97207: Calling all_inventory to load vars for managed_node1 13040 1726882404.97209: Calling groups_inventory to load vars for managed_node1 13040 1726882404.97212: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.97222: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.97223: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.97225: Calling groups_plugins_play to load vars for managed_node1 13040 1726882404.97357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.97468: done with get_vars() 13040 1726882404.97474: done getting variables 13040 1726882404.97512: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'initscripts'] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml:12 Friday 20 September 2024 21:33:24 -0400 (0:00:00.014) 0:00:02.452 ****** 13040 1726882404.97530: entering _queue_task() for managed_node1/set_fact 13040 1726882404.97696: worker is 1 (out of 1 available) 13040 1726882404.97710: exiting _queue_task() for managed_node1/set_fact 13040 1726882404.97723: done queuing things up, now waiting for results queue to drain 13040 1726882404.97724: waiting for pending results... 13040 1726882404.97875: running TaskExecutor() for managed_node1/TASK: Set network provider to 'initscripts' 13040 1726882404.97930: in run() - task 0e448fcc-3ce9-b123-314b-000000000007 13040 1726882404.97940: variable 'ansible_search_path' from source: unknown 13040 1726882404.97971: calling self._execute() 13040 1726882404.98032: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.98036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.98083: variable 'omit' from source: magic vars 13040 1726882404.98149: variable 'omit' from source: magic vars 13040 1726882404.98205: variable 'omit' from source: magic vars 13040 1726882404.98244: variable 'omit' from source: magic vars 13040 1726882404.98307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13040 1726882404.98370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13040 1726882404.98399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13040 1726882404.98418: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13040 1726882404.98449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13040 1726882404.98488: variable 'inventory_hostname' from source: host vars for 'managed_node1' 13040 1726882404.98496: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.98503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.98623: Set connection var ansible_shell_executable to /bin/sh 13040 1726882404.98636: Set connection var ansible_timeout to 10 13040 1726882404.98658: Set connection var ansible_pipelining to False 13040 1726882404.98679: Set connection var ansible_shell_type to sh 13040 1726882404.98687: Set connection var ansible_connection to ssh 13040 1726882404.98697: Set connection var ansible_module_compression to ZIP_DEFLATED 13040 1726882404.98722: variable 'ansible_shell_executable' from source: unknown 13040 1726882404.98729: variable 'ansible_connection' from source: unknown 13040 1726882404.98736: variable 'ansible_module_compression' from source: unknown 13040 1726882404.98742: variable 'ansible_shell_type' from source: unknown 13040 1726882404.98748: variable 'ansible_shell_executable' from source: unknown 13040 1726882404.98763: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882404.98772: variable 'ansible_pipelining' from source: unknown 13040 1726882404.98778: variable 'ansible_timeout' from source: unknown 13040 1726882404.98785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882404.98925: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13040 1726882404.98941: variable 'omit' from source: magic vars 13040 1726882404.98951: starting attempt loop 13040 1726882404.98961: running the handler 13040 1726882404.98984: handler run complete 13040 1726882404.99002: attempt loop complete, returning result 13040 1726882404.99009: _execute() done 13040 1726882404.99015: dumping result to json 13040 1726882404.99022: done dumping result, returning 13040 1726882404.99033: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'initscripts' [0e448fcc-3ce9-b123-314b-000000000007] 13040 1726882404.99044: sending task result for task 0e448fcc-3ce9-b123-314b-000000000007
ok: [managed_node1] => {
    "ansible_facts": {
        "network_provider": "initscripts"
    },
    "changed": false
}
13040 1726882404.99207: no more pending results, returning what we have 13040 1726882404.99210: results queue empty 13040 1726882404.99211: checking for any_errors_fatal 13040 1726882404.99219: done checking for any_errors_fatal 13040 1726882404.99220: checking for max_fail_percentage 13040 1726882404.99222: done checking for max_fail_percentage 13040 1726882404.99223: checking to see if all hosts have failed and the running result is not ok 13040 1726882404.99224: done checking to see if all hosts have failed 13040 1726882404.99225: getting the remaining hosts for this loop 13040 1726882404.99226: done getting the remaining hosts for this loop 13040 1726882404.99230: getting the next task for host managed_node1 13040 1726882404.99237: done getting next task for host managed_node1 13040 1726882404.99240: ^ task is: TASK: meta (flush_handlers) 13040 1726882404.99241: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882404.99246: getting variables 13040 1726882404.99248: in VariableManager get_vars() 13040 1726882404.99285: Calling all_inventory to load vars for managed_node1 13040 1726882404.99288: Calling groups_inventory to load vars for managed_node1 13040 1726882404.99291: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.99304: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.99307: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.99309: Calling groups_plugins_play to load vars for managed_node1 13040 1726882404.99495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882404.99792: done with get_vars() 13040 1726882404.99838: done getting variables 13040 1726882404.99874: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000007 13040 1726882404.99876: WORKER PROCESS EXITING 13040 1726882404.99918: in VariableManager get_vars() 13040 1726882404.99927: Calling all_inventory to load vars for managed_node1 13040 1726882404.99929: Calling groups_inventory to load vars for managed_node1 13040 1726882404.99930: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882404.99934: Calling all_plugins_play to load vars for managed_node1 13040 1726882404.99936: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882404.99938: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.00073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.00216: done with get_vars() 13040 1726882405.00225: done queuing things up, now waiting for results queue to drain 13040 1726882405.00226: results queue empty 13040 1726882405.00227: checking for any_errors_fatal 13040 1726882405.00229: done 
checking for any_errors_fatal 13040 1726882405.00229: checking for max_fail_percentage 13040 1726882405.00230: done checking for max_fail_percentage 13040 1726882405.00231: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.00231: done checking to see if all hosts have failed 13040 1726882405.00232: getting the remaining hosts for this loop 13040 1726882405.00233: done getting the remaining hosts for this loop 13040 1726882405.00234: getting the next task for host managed_node1 13040 1726882405.00237: done getting next task for host managed_node1 13040 1726882405.00238: ^ task is: TASK: meta (flush_handlers) 13040 1726882405.00239: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882405.00245: getting variables 13040 1726882405.00245: in VariableManager get_vars() 13040 1726882405.00250: Calling all_inventory to load vars for managed_node1 13040 1726882405.00252: Calling groups_inventory to load vars for managed_node1 13040 1726882405.00254: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.00257: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.00258: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.00260: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.00347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.00450: done with get_vars() 13040 1726882405.00457: done getting variables 13040 1726882405.00488: in VariableManager get_vars() 13040 1726882405.00494: Calling all_inventory to load vars for managed_node1 13040 1726882405.00495: Calling groups_inventory to load vars for managed_node1 
13040 1726882405.00497: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.00499: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.00501: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.00502: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.00583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.00687: done with get_vars() 13040 1726882405.00695: done queuing things up, now waiting for results queue to drain 13040 1726882405.00696: results queue empty 13040 1726882405.00696: checking for any_errors_fatal 13040 1726882405.00697: done checking for any_errors_fatal 13040 1726882405.00697: checking for max_fail_percentage 13040 1726882405.00698: done checking for max_fail_percentage 13040 1726882405.00698: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.00699: done checking to see if all hosts have failed 13040 1726882405.00699: getting the remaining hosts for this loop 13040 1726882405.00700: done getting the remaining hosts for this loop 13040 1726882405.00701: getting the next task for host managed_node1 13040 1726882405.00704: done getting next task for host managed_node1 13040 1726882405.00704: ^ task is: None 13040 1726882405.00705: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.00706: done queuing things up, now waiting for results queue to drain 13040 1726882405.00706: results queue empty 13040 1726882405.00707: checking for any_errors_fatal 13040 1726882405.00707: done checking for any_errors_fatal 13040 1726882405.00708: checking for max_fail_percentage 13040 1726882405.00708: done checking for max_fail_percentage 13040 1726882405.00709: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.00709: done checking to see if all hosts have failed 13040 1726882405.00710: getting the next task for host managed_node1 13040 1726882405.00712: done getting next task for host managed_node1 13040 1726882405.00712: ^ task is: None 13040 1726882405.00713: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.00748: in VariableManager get_vars() 13040 1726882405.00776: done with get_vars() 13040 1726882405.00781: in VariableManager get_vars() 13040 1726882405.00795: done with get_vars() 13040 1726882405.00798: variable 'omit' from source: magic vars 13040 1726882405.00819: in VariableManager get_vars() 13040 1726882405.00833: done with get_vars() 13040 1726882405.00846: variable 'omit' from source: magic vars PLAY [Play for testing bond removal] ******************************************* 13040 1726882405.01426: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 13040 1726882405.01448: getting the remaining hosts for this loop 13040 1726882405.01449: done getting the remaining hosts for this loop 13040 1726882405.01451: getting the next task for host managed_node1 13040 1726882405.01454: done getting next task for host managed_node1 13040 1726882405.01456: ^ task is: TASK: Gathering Facts 13040 1726882405.01456: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.01458: getting variables 13040 1726882405.01458: in VariableManager get_vars() 13040 1726882405.01473: Calling all_inventory to load vars for managed_node1 13040 1726882405.01475: Calling groups_inventory to load vars for managed_node1 13040 1726882405.01476: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.01479: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.01488: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.01490: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.01570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.01675: done with get_vars() 13040 1726882405.01680: done getting variables 13040 1726882405.01705: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3 Friday 20 September 2024 21:33:25 -0400 (0:00:00.041) 0:00:02.494 ****** 13040 1726882405.01720: entering _queue_task() for managed_node1/gather_facts 13040 1726882405.01937: worker is 1 (out of 1 available) 13040 1726882405.01948: exiting _queue_task() for managed_node1/gather_facts 13040 1726882405.01962: done queuing things up, now waiting for results queue to drain 13040 1726882405.01965: waiting for pending results... 
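The "entering _queue_task" / "worker is 1 (out of 1 available)" / "waiting for pending results..." sequence above repeats for every task in this run. The dispatch pattern can be sketched with a plain thread and queue — a simplified illustrative model, not ansible-core's actual `WorkerProcess` (which uses separate processes and a shared results queue):

```python
import queue
import threading

def worker_run(task_name, results_q):
    # Worker side: "execute" the task (stubbed here) and push its
    # result onto the shared results queue.
    results_q.put((task_name, "ok"))

def queue_task(task_name, results_q):
    # Strategy side, loosely mirroring the log's "entering _queue_task"
    # lines: hand the task to a worker and return immediately.
    worker = threading.Thread(target=worker_run, args=(task_name, results_q))
    worker.start()
    return worker

results = queue.Queue()
w = queue_task("Gathering Facts", results)
w.join()                # "waiting for pending results..."
record = results.get()  # drain the results queue
```

The strategy plugin never blocks on task execution itself; it only waits on the results queue, which is why "done queuing things up, now waiting for results queue to drain" appears before each result arrives.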
13040 1726882405.02143: running TaskExecutor() for managed_node1/TASK: Gathering Facts 13040 1726882405.02232: in run() - task 0e448fcc-3ce9-b123-314b-000000000218 13040 1726882405.02243: variable 'ansible_search_path' from source: unknown 13040 1726882405.02276: calling self._execute() 13040 1726882405.02362: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.02375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.02387: variable 'omit' from source: magic vars 13040 1726882405.02868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.05258: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.05306: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.05333: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.05359: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.05380: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.05448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.05472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.05492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 13040 1726882405.05521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.05532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.05630: variable 'ansible_distribution' from source: facts 13040 1726882405.05633: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.05648: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.05651: when evaluation is False, skipping this task 13040 1726882405.05653: _execute() done 13040 1726882405.05658: dumping result to json 13040 1726882405.05662: done dumping result, returning 13040 1726882405.05669: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-b123-314b-000000000218] 13040 1726882405.05674: sending task result for task 0e448fcc-3ce9-b123-314b-000000000218 13040 1726882405.05753: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000218 13040 1726882405.05756: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.05800: no more pending results, returning what we have 13040 1726882405.05803: results queue empty 13040 1726882405.05804: checking for any_errors_fatal 13040 1726882405.05805: done checking for any_errors_fatal 13040 1726882405.05806: checking for max_fail_percentage 13040 1726882405.05808: done checking for max_fail_percentage 13040 1726882405.05808: checking to see if all hosts have 
failed and the running result is not ok 13040 1726882405.05809: done checking to see if all hosts have failed 13040 1726882405.05810: getting the remaining hosts for this loop 13040 1726882405.05811: done getting the remaining hosts for this loop 13040 1726882405.05814: getting the next task for host managed_node1 13040 1726882405.05821: done getting next task for host managed_node1 13040 1726882405.05823: ^ task is: TASK: meta (flush_handlers) 13040 1726882405.05825: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882405.05829: getting variables 13040 1726882405.05830: in VariableManager get_vars() 13040 1726882405.05886: Calling all_inventory to load vars for managed_node1 13040 1726882405.05889: Calling groups_inventory to load vars for managed_node1 13040 1726882405.05891: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.05902: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.05904: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.05906: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.06198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.06312: done with get_vars() 13040 1726882405.06319: done getting variables 13040 1726882405.06369: in VariableManager get_vars() 13040 1726882405.06382: Calling all_inventory to load vars for managed_node1 13040 1726882405.06384: Calling groups_inventory to load vars for managed_node1 13040 1726882405.06385: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.06388: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.06389: Calling 
groups_plugins_inventory to load vars for managed_node1 13040 1726882405.06391: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.06471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.06688: done with get_vars() 13040 1726882405.06700: done queuing things up, now waiting for results queue to drain 13040 1726882405.06702: results queue empty 13040 1726882405.06703: checking for any_errors_fatal 13040 1726882405.06705: done checking for any_errors_fatal 13040 1726882405.06705: checking for max_fail_percentage 13040 1726882405.06707: done checking for max_fail_percentage 13040 1726882405.06707: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.06708: done checking to see if all hosts have failed 13040 1726882405.06709: getting the remaining hosts for this loop 13040 1726882405.06710: done getting the remaining hosts for this loop 13040 1726882405.06712: getting the next task for host managed_node1 13040 1726882405.06715: done getting next task for host managed_node1 13040 1726882405.06718: ^ task is: TASK: INIT Prepare setup 13040 1726882405.06720: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.06722: getting variables 13040 1726882405.06723: in VariableManager get_vars() 13040 1726882405.06740: Calling all_inventory to load vars for managed_node1 13040 1726882405.06742: Calling groups_inventory to load vars for managed_node1 13040 1726882405.06744: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.06748: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.06758: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.06761: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.06897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.07091: done with get_vars() 13040 1726882405.07099: done getting variables 13040 1726882405.07173: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:15 Friday 20 September 2024 21:33:25 -0400 (0:00:00.054) 0:00:02.549 ****** 13040 1726882405.07198: entering _queue_task() for managed_node1/debug 13040 1726882405.07199: Creating lock for debug 13040 1726882405.07625: worker is 1 (out of 1 available) 13040 1726882405.07639: exiting _queue_task() for managed_node1/debug 13040 1726882405.07649: done queuing things up, now waiting for results queue to drain 13040 1726882405.07654: waiting for pending results... 
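Every task in this stretch of the run is skipped for the same reason: the `when:` expression recorded in the "Evaluated conditional (...): False" lines is False on these hosts. A hypothetical Python re-implementation of that boolean, for illustration only — the real evaluation goes through Jinja2 inside ansible-core:

```python
def evaluate_when(facts):
    # The exact expression from the log:
    # (ansible_distribution in ['CentOS','RedHat'] and
    #  ansible_distribution_major_version | int < 9)
    return (
        facts["ansible_distribution"] in ["CentOS", "RedHat"]
        and int(facts["ansible_distribution_major_version"]) < 9
    )
```

Any distribution other than CentOS/RedHat, or a major version of 9 or later, makes the expression False, which produces the `skipping: [managed_node1]` results with `"skip_reason": "Conditional result was False"` seen throughout this log.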
13040 1726882405.07833: running TaskExecutor() for managed_node1/TASK: INIT Prepare setup 13040 1726882405.07927: in run() - task 0e448fcc-3ce9-b123-314b-00000000000b 13040 1726882405.07945: variable 'ansible_search_path' from source: unknown 13040 1726882405.07992: calling self._execute() 13040 1726882405.08090: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.08102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.08119: variable 'omit' from source: magic vars 13040 1726882405.08584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.10431: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.10491: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.10517: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.10543: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.10568: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.10644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.10668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.10686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 13040 1726882405.10714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.10725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.10822: variable 'ansible_distribution' from source: facts 13040 1726882405.10827: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.10842: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.10845: when evaluation is False, skipping this task 13040 1726882405.10848: _execute() done 13040 1726882405.10850: dumping result to json 13040 1726882405.10852: done dumping result, returning 13040 1726882405.10862: done running TaskExecutor() for managed_node1/TASK: INIT Prepare setup [0e448fcc-3ce9-b123-314b-00000000000b] 13040 1726882405.10870: sending task result for task 0e448fcc-3ce9-b123-314b-00000000000b 13040 1726882405.10949: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000000b 13040 1726882405.10952: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882405.11008: no more pending results, returning what we have 13040 1726882405.11012: results queue empty 13040 1726882405.11013: checking for any_errors_fatal 13040 1726882405.11014: done checking for any_errors_fatal 13040 1726882405.11015: checking for max_fail_percentage 13040 1726882405.11017: done checking for max_fail_percentage 13040 1726882405.11017: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.11018: 
done checking to see if all hosts have failed 13040 1726882405.11019: getting the remaining hosts for this loop 13040 1726882405.11020: done getting the remaining hosts for this loop 13040 1726882405.11024: getting the next task for host managed_node1 13040 1726882405.11031: done getting next task for host managed_node1 13040 1726882405.11034: ^ task is: TASK: Install dnsmasq 13040 1726882405.11037: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882405.11040: getting variables 13040 1726882405.11042: in VariableManager get_vars() 13040 1726882405.11096: Calling all_inventory to load vars for managed_node1 13040 1726882405.11099: Calling groups_inventory to load vars for managed_node1 13040 1726882405.11102: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.11111: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.11113: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.11116: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.11291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.11505: done with get_vars() 13040 1726882405.11516: done getting variables 13040 1726882405.11566: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:33:25 -0400 (0:00:00.043) 0:00:02.593 ****** 13040 1726882405.11589: entering _queue_task() for managed_node1/package 13040 1726882405.11790: worker is 1 (out of 1 available) 13040 1726882405.11803: exiting _queue_task() for managed_node1/package 13040 1726882405.11816: done queuing things up, now waiting for results queue to drain 13040 1726882405.11817: waiting for pending results... 13040 1726882405.11982: running TaskExecutor() for managed_node1/TASK: Install dnsmasq 13040 1726882405.12069: in run() - task 0e448fcc-3ce9-b123-314b-00000000000f 13040 1726882405.12083: variable 'ansible_search_path' from source: unknown 13040 1726882405.12086: variable 'ansible_search_path' from source: unknown 13040 1726882405.12116: calling self._execute() 13040 1726882405.12190: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.12196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.12210: variable 'omit' from source: magic vars 13040 1726882405.12626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.14334: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.14384: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.14410: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.14435: 
Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.14457: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.14524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.14545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.14564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.14592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.14604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.14703: variable 'ansible_distribution' from source: facts 13040 1726882405.14708: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.14722: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.14726: when evaluation is False, skipping this task 13040 1726882405.14728: _execute() done 13040 1726882405.14730: dumping result to json 13040 1726882405.14733: done dumping result, returning 13040 1726882405.14740: done running TaskExecutor() for managed_node1/TASK: Install dnsmasq 
[0e448fcc-3ce9-b123-314b-00000000000f] 13040 1726882405.14745: sending task result for task 0e448fcc-3ce9-b123-314b-00000000000f 13040 1726882405.14833: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000000f 13040 1726882405.14836: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.14890: no more pending results, returning what we have 13040 1726882405.14894: results queue empty 13040 1726882405.14895: checking for any_errors_fatal 13040 1726882405.14902: done checking for any_errors_fatal 13040 1726882405.14902: checking for max_fail_percentage 13040 1726882405.14904: done checking for max_fail_percentage 13040 1726882405.14905: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.14905: done checking to see if all hosts have failed 13040 1726882405.14906: getting the remaining hosts for this loop 13040 1726882405.14907: done getting the remaining hosts for this loop 13040 1726882405.14911: getting the next task for host managed_node1 13040 1726882405.14916: done getting next task for host managed_node1 13040 1726882405.14919: ^ task is: TASK: Install pgrep, sysctl 13040 1726882405.14921: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.14924: getting variables 13040 1726882405.14926: in VariableManager get_vars() 13040 1726882405.14981: Calling all_inventory to load vars for managed_node1 13040 1726882405.14984: Calling groups_inventory to load vars for managed_node1 13040 1726882405.14986: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.14995: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.14997: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.15000: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.15145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.15275: done with get_vars() 13040 1726882405.15283: done getting variables 13040 1726882405.15322: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 21:33:25 -0400 (0:00:00.037) 0:00:02.630 ****** 13040 1726882405.15344: entering _queue_task() for managed_node1/package 13040 1726882405.15536: worker is 1 (out of 1 available) 13040 1726882405.15550: exiting _queue_task() for managed_node1/package 13040 1726882405.15567: done queuing things up, now waiting for results queue to drain 13040 1726882405.15568: waiting for pending results... 
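Because `-vvvv` output like the stream above wraps many records onto long lines, it can help to re-split it on the `<pid> <epoch>.<micros>:` prefix that starts every record. A small convenience sketch — the prefix format is inferred from this log, not a documented ansible-core contract:

```python
import re

# Every record in this log starts with "<pid> <10-digit epoch>.<micros>: ".
# Requiring 10 digits before the dot avoids false matches on durations
# such as "0:00:02.494" inside task-timing banners.
RECORD_START = re.compile(r"\d+ \d{10}\.\d+: ")

def split_records(text):
    """Split a wrapped ansible -vvvv stream into one string per record."""
    starts = [m.start() for m in RECORD_START.finditer(text)]
    if not starts:
        return [text] if text else []
    return [text[a:b].strip() for a, b in zip(starts, starts[1:] + [len(text)])]
```

Each returned string is then one complete debug record, regardless of how the original capture broke its lines.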
13040 1726882405.15715: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 13040 1726882405.15803: in run() - task 0e448fcc-3ce9-b123-314b-000000000010 13040 1726882405.15816: variable 'ansible_search_path' from source: unknown 13040 1726882405.15820: variable 'ansible_search_path' from source: unknown 13040 1726882405.15848: calling self._execute() 13040 1726882405.15917: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.15921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.15930: variable 'omit' from source: magic vars 13040 1726882405.16232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.17838: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.17913: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.17940: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.17970: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.17991: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.18047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.18071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.18089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.18115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.18125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.18222: variable 'ansible_distribution' from source: facts 13040 1726882405.18226: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.18242: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.18245: when evaluation is False, skipping this task 13040 1726882405.18247: _execute() done 13040 1726882405.18250: dumping result to json 13040 1726882405.18254: done dumping result, returning 13040 1726882405.18258: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [0e448fcc-3ce9-b123-314b-000000000010] 13040 1726882405.18266: sending task result for task 0e448fcc-3ce9-b123-314b-000000000010 13040 1726882405.18357: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000010 13040 1726882405.18359: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.18408: no more pending results, returning what we have 13040 1726882405.18412: results queue empty 13040 1726882405.18413: checking for any_errors_fatal 13040 1726882405.18417: done checking for any_errors_fatal 13040 1726882405.18418: checking for max_fail_percentage 13040 1726882405.18420: done checking for 
max_fail_percentage 13040 1726882405.18420: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.18421: done checking to see if all hosts have failed 13040 1726882405.18422: getting the remaining hosts for this loop 13040 1726882405.18423: done getting the remaining hosts for this loop 13040 1726882405.18426: getting the next task for host managed_node1 13040 1726882405.18432: done getting next task for host managed_node1 13040 1726882405.18435: ^ task is: TASK: Install pgrep, sysctl 13040 1726882405.18438: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.18441: getting variables 13040 1726882405.18442: in VariableManager get_vars() 13040 1726882405.18496: Calling all_inventory to load vars for managed_node1 13040 1726882405.18503: Calling groups_inventory to load vars for managed_node1 13040 1726882405.18506: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.18515: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.18517: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.18519: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.18636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.18762: done with get_vars() 13040 1726882405.18772: done getting variables 13040 1726882405.18811: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 21:33:25 -0400 (0:00:00.034) 0:00:02.665 ****** 13040 1726882405.18832: entering _queue_task() for managed_node1/package 13040 1726882405.19056: worker is 1 (out of 1 available) 13040 1726882405.19070: exiting _queue_task() for managed_node1/package 13040 1726882405.19083: done queuing things up, now waiting for results queue to drain 13040 1726882405.19084: waiting for pending results... 
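The loader lines above alternate between fresh loads and `found_in_cache=True` hits: each filter/action plugin file is imported once per process and then served from a cache on subsequent lookups. A minimal sketch of that memoization pattern — illustrative only, as ansible-core's real plugin loader does considerably more (path searching, class-only loading, collection routing):

```python
_plugin_cache = {}

def load_plugin(name, path):
    """Return (plugin, found_in_cache) for a plugin file, caching by key.

    Mirrors the "(found_in_cache=True/False)" annotations in the log;
    object() stands in for actually importing the module at `path`.
    """
    key = (name, path)
    if key in _plugin_cache:
        return _plugin_cache[key], True
    plugin = object()  # stand-in for the real module import
    _plugin_cache[key] = plugin
    return plugin, False
```

This is why the first `Loading FilterModule 'core' ...` in a worker shows a plain load while later lookups of the same file report `found_in_cache=True, class_only=False`.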
13040 1726882405.19327: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 13040 1726882405.19450: in run() - task 0e448fcc-3ce9-b123-314b-000000000011 13040 1726882405.19481: variable 'ansible_search_path' from source: unknown 13040 1726882405.19487: variable 'ansible_search_path' from source: unknown 13040 1726882405.19527: calling self._execute() 13040 1726882405.19609: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.19619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.19634: variable 'omit' from source: magic vars 13040 1726882405.20038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.21850: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.21902: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.21928: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.21956: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.21976: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.22034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.22056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.22074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.22102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.22113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.22211: variable 'ansible_distribution' from source: facts 13040 1726882405.22215: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.22232: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.22235: when evaluation is False, skipping this task 13040 1726882405.22237: _execute() done 13040 1726882405.22240: dumping result to json 13040 1726882405.22241: done dumping result, returning 13040 1726882405.22247: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [0e448fcc-3ce9-b123-314b-000000000011] 13040 1726882405.22255: sending task result for task 0e448fcc-3ce9-b123-314b-000000000011 13040 1726882405.22348: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000011 13040 1726882405.22353: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.22408: no more pending results, returning what we have 13040 1726882405.22413: results queue empty 13040 1726882405.22414: checking for any_errors_fatal 13040 1726882405.22420: done checking for any_errors_fatal 13040 1726882405.22421: checking for max_fail_percentage 13040 1726882405.22422: done checking for 
max_fail_percentage 13040 1726882405.22423: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.22424: done checking to see if all hosts have failed 13040 1726882405.22425: getting the remaining hosts for this loop 13040 1726882405.22426: done getting the remaining hosts for this loop 13040 1726882405.22429: getting the next task for host managed_node1 13040 1726882405.22436: done getting next task for host managed_node1 13040 1726882405.22438: ^ task is: TASK: Create test interfaces 13040 1726882405.22441: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.22445: getting variables 13040 1726882405.22447: in VariableManager get_vars() 13040 1726882405.22514: Calling all_inventory to load vars for managed_node1 13040 1726882405.22518: Calling groups_inventory to load vars for managed_node1 13040 1726882405.22520: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.22532: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.22535: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.22538: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.22778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.22975: done with get_vars() 13040 1726882405.22986: done getting variables 13040 1726882405.23090: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 21:33:25 -0400 (0:00:00.042) 0:00:02.708 ****** 13040 1726882405.23118: entering _queue_task() for managed_node1/shell 13040 1726882405.23120: Creating lock for shell 13040 1726882405.23402: worker is 1 (out of 1 available) 13040 1726882405.23416: exiting _queue_task() for managed_node1/shell 13040 1726882405.23427: done queuing things up, now waiting for results queue to drain 13040 1726882405.23429: waiting for pending results... 
13040 1726882405.23824: running TaskExecutor() for managed_node1/TASK: Create test interfaces 13040 1726882405.23930: in run() - task 0e448fcc-3ce9-b123-314b-000000000012 13040 1726882405.23940: variable 'ansible_search_path' from source: unknown 13040 1726882405.23944: variable 'ansible_search_path' from source: unknown 13040 1726882405.23977: calling self._execute() 13040 1726882405.24040: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.24043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.24054: variable 'omit' from source: magic vars 13040 1726882405.24370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.26136: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.26342: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.26388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.26442: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.26482: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.26560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.26598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.26638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.26718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.26738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.26930: variable 'ansible_distribution' from source: facts 13040 1726882405.26942: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.26992: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.27000: when evaluation is False, skipping this task 13040 1726882405.27007: _execute() done 13040 1726882405.27014: dumping result to json 13040 1726882405.27020: done dumping result, returning 13040 1726882405.27030: done running TaskExecutor() for managed_node1/TASK: Create test interfaces [0e448fcc-3ce9-b123-314b-000000000012] 13040 1726882405.27041: sending task result for task 0e448fcc-3ce9-b123-314b-000000000012 13040 1726882405.27154: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000012 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.27205: no more pending results, returning what we have 13040 1726882405.27209: results queue empty 13040 1726882405.27210: checking for any_errors_fatal 13040 1726882405.27215: done checking for any_errors_fatal 13040 1726882405.27216: checking for max_fail_percentage 13040 1726882405.27218: done checking for max_fail_percentage 13040 1726882405.27218: 
checking to see if all hosts have failed and the running result is not ok 13040 1726882405.27219: done checking to see if all hosts have failed 13040 1726882405.27220: getting the remaining hosts for this loop 13040 1726882405.27221: done getting the remaining hosts for this loop 13040 1726882405.27224: getting the next task for host managed_node1 13040 1726882405.27234: done getting next task for host managed_node1 13040 1726882405.27236: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13040 1726882405.27239: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.27242: getting variables 13040 1726882405.27244: in VariableManager get_vars() 13040 1726882405.27299: Calling all_inventory to load vars for managed_node1 13040 1726882405.27302: Calling groups_inventory to load vars for managed_node1 13040 1726882405.27304: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.27316: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.27318: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.27322: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.27493: WORKER PROCESS EXITING 13040 1726882405.27507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.27714: done with get_vars() 13040 1726882405.27724: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:25 -0400 (0:00:00.046) 0:00:02.755 ****** 13040 1726882405.27816: entering _queue_task() for managed_node1/include_tasks 13040 1726882405.28145: worker is 1 (out of 1 available) 13040 1726882405.28158: exiting _queue_task() for managed_node1/include_tasks 13040 1726882405.28171: done queuing things up, now waiting for results queue to drain 13040 1726882405.28172: waiting for pending results... 
13040 1726882405.28434: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 13040 1726882405.28517: in run() - task 0e448fcc-3ce9-b123-314b-000000000016 13040 1726882405.28529: variable 'ansible_search_path' from source: unknown 13040 1726882405.28532: variable 'ansible_search_path' from source: unknown 13040 1726882405.28569: calling self._execute() 13040 1726882405.28632: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.28636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.28644: variable 'omit' from source: magic vars 13040 1726882405.28958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.31049: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.31111: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.31139: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.31177: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.31211: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.31269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.31295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.31313: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.31343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.31356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.31456: variable 'ansible_distribution' from source: facts 13040 1726882405.31460: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.31476: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.31479: when evaluation is False, skipping this task 13040 1726882405.31481: _execute() done 13040 1726882405.31484: dumping result to json 13040 1726882405.31486: done dumping result, returning 13040 1726882405.31493: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-b123-314b-000000000016] 13040 1726882405.31498: sending task result for task 0e448fcc-3ce9-b123-314b-000000000016 13040 1726882405.31590: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000016 13040 1726882405.31592: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.31636: no more pending results, returning what we have 13040 1726882405.31640: results queue empty 13040 1726882405.31640: checking for any_errors_fatal 13040 1726882405.31647: done checking for any_errors_fatal 13040 
1726882405.31648: checking for max_fail_percentage 13040 1726882405.31649: done checking for max_fail_percentage 13040 1726882405.31650: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.31651: done checking to see if all hosts have failed 13040 1726882405.31653: getting the remaining hosts for this loop 13040 1726882405.31655: done getting the remaining hosts for this loop 13040 1726882405.31658: getting the next task for host managed_node1 13040 1726882405.31670: done getting next task for host managed_node1 13040 1726882405.31673: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13040 1726882405.31676: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.31679: getting variables 13040 1726882405.31680: in VariableManager get_vars() 13040 1726882405.31729: Calling all_inventory to load vars for managed_node1 13040 1726882405.31731: Calling groups_inventory to load vars for managed_node1 13040 1726882405.31734: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.31743: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.31745: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.31747: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.31910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.32029: done with get_vars() 13040 1726882405.32036: done getting variables 13040 1726882405.32107: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 13040 1726882405.32197: variable 'interface' from source: task vars 13040 1726882405.32201: variable 'dhcp_interface1' from source: play vars 13040 1726882405.32245: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:25 -0400 (0:00:00.044) 0:00:02.800 ****** 13040 1726882405.32279: entering _queue_task() for managed_node1/assert 13040 1726882405.32281: Creating lock for assert 13040 1726882405.32480: worker is 1 (out of 1 available) 13040 1726882405.32493: exiting _queue_task() for managed_node1/assert 13040 1726882405.32504: done queuing things up, now waiting for results queue to drain 13040 
1726882405.32506: waiting for pending results... 13040 1726882405.32658: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' 13040 1726882405.32733: in run() - task 0e448fcc-3ce9-b123-314b-000000000017 13040 1726882405.32747: variable 'ansible_search_path' from source: unknown 13040 1726882405.32750: variable 'ansible_search_path' from source: unknown 13040 1726882405.32782: calling self._execute() 13040 1726882405.32845: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.32849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.32860: variable 'omit' from source: magic vars 13040 1726882405.33226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.35616: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.35672: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.35704: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.35769: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.35809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.35892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.35928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.35969: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.36025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.36049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.36211: variable 'ansible_distribution' from source: facts 13040 1726882405.36224: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.36255: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.36272: when evaluation is False, skipping this task 13040 1726882405.36281: _execute() done 13040 1726882405.36287: dumping result to json 13040 1726882405.36294: done dumping result, returning 13040 1726882405.36309: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' [0e448fcc-3ce9-b123-314b-000000000017] 13040 1726882405.36324: sending task result for task 0e448fcc-3ce9-b123-314b-000000000017 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.36512: no more pending results, returning what we have 13040 1726882405.36516: results queue empty 13040 1726882405.36516: checking for any_errors_fatal 13040 1726882405.36523: done checking for any_errors_fatal 13040 1726882405.36523: checking for max_fail_percentage 13040 1726882405.36525: done checking for max_fail_percentage 13040 1726882405.36526: 
checking to see if all hosts have failed and the running result is not ok 13040 1726882405.36527: done checking to see if all hosts have failed 13040 1726882405.36527: getting the remaining hosts for this loop 13040 1726882405.36528: done getting the remaining hosts for this loop 13040 1726882405.36532: getting the next task for host managed_node1 13040 1726882405.36550: done getting next task for host managed_node1 13040 1726882405.36554: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13040 1726882405.36560: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.36567: getting variables 13040 1726882405.36573: in VariableManager get_vars() 13040 1726882405.36644: Calling all_inventory to load vars for managed_node1 13040 1726882405.36647: Calling groups_inventory to load vars for managed_node1 13040 1726882405.36649: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.36700: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.36704: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.36716: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000017 13040 1726882405.36718: WORKER PROCESS EXITING 13040 1726882405.36726: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.37172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.37407: done with get_vars() 13040 1726882405.37418: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:25 -0400 (0:00:00.052) 0:00:02.852 ****** 13040 1726882405.37514: entering _queue_task() for managed_node1/include_tasks 13040 1726882405.37866: worker is 1 (out of 1 available) 13040 1726882405.37878: exiting _queue_task() for managed_node1/include_tasks 13040 1726882405.37888: done queuing things up, now waiting for results queue to drain 13040 1726882405.37889: waiting for pending results... 
13040 1726882405.38135: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 13040 1726882405.38291: in run() - task 0e448fcc-3ce9-b123-314b-00000000001b 13040 1726882405.38312: variable 'ansible_search_path' from source: unknown 13040 1726882405.38321: variable 'ansible_search_path' from source: unknown 13040 1726882405.38383: calling self._execute() 13040 1726882405.38520: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.38530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.38543: variable 'omit' from source: magic vars 13040 1726882405.39604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.41836: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.41908: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.41955: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.41995: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.42026: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.42115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.42147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.42182: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.42228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.42266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.42408: variable 'ansible_distribution' from source: facts 13040 1726882405.42419: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.42441: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.42448: when evaluation is False, skipping this task 13040 1726882405.42459: _execute() done 13040 1726882405.42469: dumping result to json 13040 1726882405.42477: done dumping result, returning 13040 1726882405.42498: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-b123-314b-00000000001b] 13040 1726882405.42514: sending task result for task 0e448fcc-3ce9-b123-314b-00000000001b skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.42709: no more pending results, returning what we have 13040 1726882405.42713: results queue empty 13040 1726882405.42714: checking for any_errors_fatal 13040 1726882405.42721: done checking for any_errors_fatal 13040 1726882405.42722: checking for max_fail_percentage 13040 1726882405.42723: done checking for max_fail_percentage 13040 1726882405.42724: checking to 
see if all hosts have failed and the running result is not ok 13040 1726882405.42725: done checking to see if all hosts have failed 13040 1726882405.42726: getting the remaining hosts for this loop 13040 1726882405.42727: done getting the remaining hosts for this loop 13040 1726882405.42731: getting the next task for host managed_node1 13040 1726882405.42739: done getting next task for host managed_node1 13040 1726882405.42742: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13040 1726882405.42745: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.42749: getting variables 13040 1726882405.42751: in VariableManager get_vars() 13040 1726882405.42807: Calling all_inventory to load vars for managed_node1 13040 1726882405.42810: Calling groups_inventory to load vars for managed_node1 13040 1726882405.42812: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.42823: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.42826: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.42829: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.43041: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000001b 13040 1726882405.43045: WORKER PROCESS EXITING 13040 1726882405.43059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.43262: done with get_vars() 13040 1726882405.43274: done getting variables 13040 1726882405.43327: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13040 1726882405.43442: variable 'interface' from source: task vars 13040 1726882405.43446: variable 'dhcp_interface2' from source: play vars 13040 1726882405.43508: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:25 -0400 (0:00:00.060) 0:00:02.912 ****** 13040 1726882405.43537: entering _queue_task() for managed_node1/assert 13040 1726882405.43845: worker is 1 (out of 1 available) 13040 1726882405.43857: exiting _queue_task() for managed_node1/assert 13040 
1726882405.43871: done queuing things up, now waiting for results queue to drain 13040 1726882405.43873: waiting for pending results... 13040 1726882405.44195: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' 13040 1726882405.44314: in run() - task 0e448fcc-3ce9-b123-314b-00000000001c 13040 1726882405.44334: variable 'ansible_search_path' from source: unknown 13040 1726882405.44342: variable 'ansible_search_path' from source: unknown 13040 1726882405.44387: calling self._execute() 13040 1726882405.44487: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.44499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.44514: variable 'omit' from source: magic vars 13040 1726882405.44940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.47303: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.47377: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.47417: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.47459: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.47495: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.47581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.47617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 13040 1726882405.47647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.47698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.47718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.47858: variable 'ansible_distribution' from source: facts 13040 1726882405.47876: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.47899: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.47907: when evaluation is False, skipping this task 13040 1726882405.47914: _execute() done 13040 1726882405.47920: dumping result to json 13040 1726882405.47927: done dumping result, returning 13040 1726882405.47938: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' [0e448fcc-3ce9-b123-314b-00000000001c] 13040 1726882405.47947: sending task result for task 0e448fcc-3ce9-b123-314b-00000000001c skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.48095: no more pending results, returning what we have 13040 1726882405.48098: results queue empty 13040 1726882405.48099: checking for any_errors_fatal 13040 1726882405.48105: done checking for any_errors_fatal 13040 1726882405.48106: checking for max_fail_percentage 13040 
1726882405.48107: done checking for max_fail_percentage 13040 1726882405.48109: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.48110: done checking to see if all hosts have failed 13040 1726882405.48111: getting the remaining hosts for this loop 13040 1726882405.48112: done getting the remaining hosts for this loop 13040 1726882405.48116: getting the next task for host managed_node1 13040 1726882405.48124: done getting next task for host managed_node1 13040 1726882405.48126: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 13040 1726882405.48128: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882405.48131: getting variables 13040 1726882405.48133: in VariableManager get_vars() 13040 1726882405.48186: Calling all_inventory to load vars for managed_node1 13040 1726882405.48189: Calling groups_inventory to load vars for managed_node1 13040 1726882405.48191: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.48203: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.48206: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.48209: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.48356: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000001c 13040 1726882405.48359: WORKER PROCESS EXITING 13040 1726882405.48391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.48602: done with get_vars() 13040 1726882405.48613: done getting variables 13040 1726882405.48678: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:28 Friday 20 September 2024 21:33:25 -0400 (0:00:00.051) 0:00:02.964 ****** 13040 1726882405.48706: entering _queue_task() for managed_node1/command 13040 1726882405.48956: worker is 1 (out of 1 available) 13040 1726882405.48968: exiting _queue_task() for managed_node1/command 13040 1726882405.48980: done queuing things up, now waiting for results queue to drain 13040 1726882405.48981: waiting for pending results... 13040 1726882405.49248: running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript 13040 1726882405.49353: in run() - task 0e448fcc-3ce9-b123-314b-00000000001d 13040 1726882405.49375: variable 'ansible_search_path' from source: unknown 13040 1726882405.49416: calling self._execute() 13040 1726882405.49510: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.49522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.49541: variable 'omit' from source: magic vars 13040 1726882405.50116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.52523: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.52602: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.52643: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.52686: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.52724: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.52824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.52857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.52891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.52945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.52966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.53114: variable 'ansible_distribution' from source: facts 13040 1726882405.53130: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.53156: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.53168: when evaluation is False, skipping this task 13040 1726882405.53176: _execute() done 13040 1726882405.53183: dumping result to json 13040 1726882405.53191: done dumping result, returning 13040 1726882405.53202: done running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript 
[0e448fcc-3ce9-b123-314b-00000000001d] 13040 1726882405.53211: sending task result for task 0e448fcc-3ce9-b123-314b-00000000001d 13040 1726882405.53326: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000001d 13040 1726882405.53333: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.53390: no more pending results, returning what we have 13040 1726882405.53395: results queue empty 13040 1726882405.53396: checking for any_errors_fatal 13040 1726882405.53401: done checking for any_errors_fatal 13040 1726882405.53402: checking for max_fail_percentage 13040 1726882405.53404: done checking for max_fail_percentage 13040 1726882405.53405: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.53406: done checking to see if all hosts have failed 13040 1726882405.53406: getting the remaining hosts for this loop 13040 1726882405.53408: done getting the remaining hosts for this loop 13040 1726882405.53412: getting the next task for host managed_node1 13040 1726882405.53419: done getting next task for host managed_node1 13040 1726882405.53422: ^ task is: TASK: TEST Add Bond with 2 ports 13040 1726882405.53424: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.53427: getting variables 13040 1726882405.53429: in VariableManager get_vars() 13040 1726882405.53489: Calling all_inventory to load vars for managed_node1 13040 1726882405.53492: Calling groups_inventory to load vars for managed_node1 13040 1726882405.53495: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.53507: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.53510: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.53513: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.53744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.53953: done with get_vars() 13040 1726882405.53965: done getting variables 13040 1726882405.54140: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:33 Friday 20 September 2024 21:33:25 -0400 (0:00:00.054) 0:00:03.018 ****** 13040 1726882405.54174: entering _queue_task() for managed_node1/debug 13040 1726882405.54592: worker is 1 (out of 1 available) 13040 1726882405.54605: exiting _queue_task() for managed_node1/debug 13040 1726882405.54617: done queuing things up, now waiting for results queue to drain 13040 1726882405.54618: waiting for pending results... 
13040 1726882405.54886: running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports 13040 1726882405.54989: in run() - task 0e448fcc-3ce9-b123-314b-00000000001e 13040 1726882405.55008: variable 'ansible_search_path' from source: unknown 13040 1726882405.55052: calling self._execute() 13040 1726882405.55149: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.55161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.55183: variable 'omit' from source: magic vars 13040 1726882405.55648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.58186: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.58273: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.58319: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.58363: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.58397: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.58488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.58523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.58564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 13040 1726882405.58613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.58636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.58790: variable 'ansible_distribution' from source: facts 13040 1726882405.58801: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.58821: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.58828: when evaluation is False, skipping this task 13040 1726882405.58835: _execute() done 13040 1726882405.58841: dumping result to json 13040 1726882405.58848: done dumping result, returning 13040 1726882405.58866: done running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports [0e448fcc-3ce9-b123-314b-00000000001e] 13040 1726882405.58878: sending task result for task 0e448fcc-3ce9-b123-314b-00000000001e skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882405.59065: no more pending results, returning what we have 13040 1726882405.59070: results queue empty 13040 1726882405.59070: checking for any_errors_fatal 13040 1726882405.59076: done checking for any_errors_fatal 13040 1726882405.59076: checking for max_fail_percentage 13040 1726882405.59078: done checking for max_fail_percentage 13040 1726882405.59079: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.59079: done checking to see if all hosts have failed 13040 1726882405.59080: getting the remaining hosts for this loop 13040 1726882405.59082: 
done getting the remaining hosts for this loop 13040 1726882405.59085: getting the next task for host managed_node1 13040 1726882405.59094: done getting next task for host managed_node1 13040 1726882405.59100: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13040 1726882405.59105: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882405.59120: getting variables 13040 1726882405.59122: in VariableManager get_vars() 13040 1726882405.59182: Calling all_inventory to load vars for managed_node1 13040 1726882405.59185: Calling groups_inventory to load vars for managed_node1 13040 1726882405.59188: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.59199: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.59202: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.59205: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.59387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.59611: done with get_vars() 13040 1726882405.59623: done getting variables 13040 1726882405.59825: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000001e 13040 1726882405.59828: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:25 -0400 (0:00:00.056) 0:00:03.075 ****** 13040 1726882405.59843: entering _queue_task() for managed_node1/include_tasks 13040 1726882405.60269: worker is 1 (out of 1 available) 13040 1726882405.60280: exiting _queue_task() for managed_node1/include_tasks 13040 1726882405.60290: done queuing things up, now waiting for results queue to drain 13040 1726882405.60291: waiting for pending results... 13040 1726882405.61256: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13040 1726882405.61429: in run() - task 0e448fcc-3ce9-b123-314b-000000000026 13040 1726882405.61479: variable 'ansible_search_path' from source: unknown 13040 1726882405.61487: variable 'ansible_search_path' from source: unknown 13040 1726882405.61529: calling self._execute() 13040 1726882405.61616: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.61636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.61654: variable 'omit' from source: magic vars 13040 1726882405.62186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.64666: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.64743: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.64788: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.64843: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.64876: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
13040 1726882405.64966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.65004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.65034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.65086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.65110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.65258: variable 'ansible_distribution' from source: facts 13040 1726882405.65274: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.65297: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.65304: when evaluation is False, skipping this task 13040 1726882405.65311: _execute() done 13040 1726882405.65317: dumping result to json 13040 1726882405.65328: done dumping result, returning 13040 1726882405.65340: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-b123-314b-000000000026] 13040 1726882405.65350: sending task result for task 0e448fcc-3ce9-b123-314b-000000000026 skipping: [managed_node1] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.65501: no more pending results, returning what we have 13040 1726882405.65506: results queue empty 13040 1726882405.65507: checking for any_errors_fatal 13040 1726882405.65515: done checking for any_errors_fatal 13040 1726882405.65516: checking for max_fail_percentage 13040 1726882405.65518: done checking for max_fail_percentage 13040 1726882405.65519: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.65519: done checking to see if all hosts have failed 13040 1726882405.65520: getting the remaining hosts for this loop 13040 1726882405.65521: done getting the remaining hosts for this loop 13040 1726882405.65526: getting the next task for host managed_node1 13040 1726882405.65533: done getting next task for host managed_node1 13040 1726882405.65537: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13040 1726882405.65540: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.65555: getting variables 13040 1726882405.65557: in VariableManager get_vars() 13040 1726882405.65615: Calling all_inventory to load vars for managed_node1 13040 1726882405.65618: Calling groups_inventory to load vars for managed_node1 13040 1726882405.65620: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.65632: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.65635: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.65638: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.65867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.66087: done with get_vars() 13040 1726882405.66211: done getting variables 13040 1726882405.66243: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000026 13040 1726882405.66246: WORKER PROCESS EXITING 13040 1726882405.66284: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:25 -0400 (0:00:00.065) 0:00:03.141 ****** 13040 1726882405.66431: entering _queue_task() for managed_node1/debug 13040 1726882405.66751: worker is 1 (out of 1 available) 13040 1726882405.66765: exiting _queue_task() for managed_node1/debug 13040 1726882405.66776: done queuing things up, now waiting for results queue to drain 13040 1726882405.66778: waiting for pending results... 
13040 1726882405.67682: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 13040 1726882405.67979: in run() - task 0e448fcc-3ce9-b123-314b-000000000027 13040 1726882405.68080: variable 'ansible_search_path' from source: unknown 13040 1726882405.68088: variable 'ansible_search_path' from source: unknown 13040 1726882405.68149: calling self._execute() 13040 1726882405.68366: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.68379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.68408: variable 'omit' from source: magic vars 13040 1726882405.68889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.72943: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.73023: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.73152: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.73194: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.73250: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.73409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.73476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882405.73579: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.73624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.73679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.74040: variable 'ansible_distribution' from source: facts 13040 1726882405.74051: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.74076: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.74083: when evaluation is False, skipping this task 13040 1726882405.74089: _execute() done 13040 1726882405.74095: dumping result to json 13040 1726882405.74102: done dumping result, returning 13040 1726882405.74116: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-b123-314b-000000000027] 13040 1726882405.74129: sending task result for task 0e448fcc-3ce9-b123-314b-000000000027 skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882405.74276: no more pending results, returning what we have 13040 1726882405.74280: results queue empty 13040 1726882405.74281: checking for any_errors_fatal 13040 1726882405.74286: done checking for any_errors_fatal 13040 1726882405.74287: checking for max_fail_percentage 13040 1726882405.74289: done checking for max_fail_percentage 13040 1726882405.74290: checking to see if all hosts have failed and the running 
result is not ok 13040 1726882405.74291: done checking to see if all hosts have failed 13040 1726882405.74291: getting the remaining hosts for this loop 13040 1726882405.74293: done getting the remaining hosts for this loop 13040 1726882405.74297: getting the next task for host managed_node1 13040 1726882405.74303: done getting next task for host managed_node1 13040 1726882405.74308: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13040 1726882405.74310: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.74323: getting variables 13040 1726882405.74326: in VariableManager get_vars() 13040 1726882405.74387: Calling all_inventory to load vars for managed_node1 13040 1726882405.74390: Calling groups_inventory to load vars for managed_node1 13040 1726882405.74393: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.74430: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.74434: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.74438: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.74994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.75244: done with get_vars() 13040 1726882405.75258: done getting variables 13040 1726882405.75352: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:25 -0400 (0:00:00.089) 0:00:03.231 ****** 13040 1726882405.75390: entering _queue_task() for managed_node1/fail 13040 1726882405.75393: Creating lock for fail 13040 1726882405.75438: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000027 13040 1726882405.75447: WORKER PROCESS EXITING 13040 1726882405.75935: worker is 1 (out of 1 available) 13040 1726882405.75947: exiting _queue_task() for managed_node1/fail 13040 1726882405.75959: done queuing things up, now waiting for results queue to drain 13040 1726882405.75961: waiting for 
pending results... 13040 1726882405.76211: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13040 1726882405.76347: in run() - task 0e448fcc-3ce9-b123-314b-000000000028 13040 1726882405.76370: variable 'ansible_search_path' from source: unknown 13040 1726882405.76378: variable 'ansible_search_path' from source: unknown 13040 1726882405.76422: calling self._execute() 13040 1726882405.76504: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.76519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.76634: variable 'omit' from source: magic vars 13040 1726882405.77571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.80362: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.80444: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.81292: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.81334: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.81374: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.81456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.81504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 13040 1726882405.81535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.81603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.81624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.81876: variable 'ansible_distribution' from source: facts 13040 1726882405.81892: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.81913: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.81920: when evaluation is False, skipping this task 13040 1726882405.81927: _execute() done 13040 1726882405.81933: dumping result to json 13040 1726882405.81940: done dumping result, returning 13040 1726882405.81951: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-b123-314b-000000000028] 13040 1726882405.81962: sending task result for task 0e448fcc-3ce9-b123-314b-000000000028 13040 1726882405.82075: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000028 13040 1726882405.82082: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.82141: no more pending 
results, returning what we have 13040 1726882405.82146: results queue empty 13040 1726882405.82147: checking for any_errors_fatal 13040 1726882405.82153: done checking for any_errors_fatal 13040 1726882405.82154: checking for max_fail_percentage 13040 1726882405.82155: done checking for max_fail_percentage 13040 1726882405.82156: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.82157: done checking to see if all hosts have failed 13040 1726882405.82158: getting the remaining hosts for this loop 13040 1726882405.82160: done getting the remaining hosts for this loop 13040 1726882405.82165: getting the next task for host managed_node1 13040 1726882405.82173: done getting next task for host managed_node1 13040 1726882405.82178: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13040 1726882405.82181: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.82194: getting variables 13040 1726882405.82196: in VariableManager get_vars() 13040 1726882405.82253: Calling all_inventory to load vars for managed_node1 13040 1726882405.82256: Calling groups_inventory to load vars for managed_node1 13040 1726882405.82259: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.82274: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.82278: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.82281: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.82503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.82710: done with get_vars() 13040 1726882405.82720: done getting variables 13040 1726882405.83033: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:25 -0400 (0:00:00.076) 0:00:03.307 ****** 13040 1726882405.83068: entering _queue_task() for managed_node1/fail 13040 1726882405.83302: worker is 1 (out of 1 available) 13040 1726882405.83314: exiting _queue_task() for managed_node1/fail 13040 1726882405.83326: done queuing things up, now waiting for results queue to drain 13040 1726882405.83327: waiting for pending results... 
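Every task skipped so far shares the same `when` guard, visible in the repeated "Evaluated conditional" lines. As a rough Python sketch of that evaluation (the facts used below are hypothetical; this log never prints the managed node's actual distribution), the conjunction short-circuits to `False` whenever the distribution is outside the CentOS/RedHat pair:

```python
def evaluate_guard(facts: dict) -> bool:
    """Mimic the Jinja2 conditional seen in the log:
    (ansible_distribution in ['CentOS','RedHat']
     and ansible_distribution_major_version | int < 9)
    """
    return (
        facts["ansible_distribution"] in ("CentOS", "RedHat")
        and int(facts["ansible_distribution_major_version"]) < 9
    )

# Hypothetical fact sets -- the real node's facts are not shown in this log.
print(evaluate_guard({"ansible_distribution": "Fedora",
                      "ansible_distribution_major_version": "40"}))  # False -> task skipped
print(evaluate_guard({"ansible_distribution": "CentOS",
                      "ansible_distribution_major_version": "8"}))   # True -> task would run
```

When the guard is `False`, `_execute()` returns a skip result without ever dispatching the module, which is why each block above ends in `skipping: [managed_node1]` with a `false_condition` field.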
13040 1726882405.84239: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13040 1726882405.84430: in run() - task 0e448fcc-3ce9-b123-314b-000000000029 13040 1726882405.84473: variable 'ansible_search_path' from source: unknown 13040 1726882405.84483: variable 'ansible_search_path' from source: unknown 13040 1726882405.84524: calling self._execute() 13040 1726882405.84612: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.84632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.84652: variable 'omit' from source: magic vars 13040 1726882405.85086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.88211: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.88392: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.88439: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.88485: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.88516: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.88611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.88645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 
1726882405.88681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.88725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.88743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.88891: variable 'ansible_distribution' from source: facts 13040 1726882405.88902: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.88924: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.88931: when evaluation is False, skipping this task 13040 1726882405.88938: _execute() done 13040 1726882405.88943: dumping result to json 13040 1726882405.88950: done dumping result, returning 13040 1726882405.88961: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-b123-314b-000000000029] 13040 1726882405.88974: sending task result for task 0e448fcc-3ce9-b123-314b-000000000029 13040 1726882405.89090: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000029 13040 1726882405.89099: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.89147: no more pending results, returning what we have 13040 1726882405.89151: results 
queue empty 13040 1726882405.89152: checking for any_errors_fatal 13040 1726882405.89157: done checking for any_errors_fatal 13040 1726882405.89158: checking for max_fail_percentage 13040 1726882405.89160: done checking for max_fail_percentage 13040 1726882405.89161: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.89162: done checking to see if all hosts have failed 13040 1726882405.89163: getting the remaining hosts for this loop 13040 1726882405.89166: done getting the remaining hosts for this loop 13040 1726882405.89170: getting the next task for host managed_node1 13040 1726882405.89177: done getting next task for host managed_node1 13040 1726882405.89181: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13040 1726882405.89184: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.89199: getting variables 13040 1726882405.89201: in VariableManager get_vars() 13040 1726882405.89263: Calling all_inventory to load vars for managed_node1 13040 1726882405.89268: Calling groups_inventory to load vars for managed_node1 13040 1726882405.89271: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.89283: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.89286: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.89289: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.89473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.89662: done with get_vars() 13040 1726882405.89676: done getting variables 13040 1726882405.89733: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:25 -0400 (0:00:00.067) 0:00:03.374 ****** 13040 1726882405.90028: entering _queue_task() for managed_node1/fail 13040 1726882405.90268: worker is 1 (out of 1 available) 13040 1726882405.90279: exiting _queue_task() for managed_node1/fail 13040 1726882405.90292: done queuing things up, now waiting for results queue to drain 13040 1726882405.90293: waiting for pending results... 
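The `task path:` lines above point into `roles/network/tasks/main.yml`, and the loader lines show the `fail` action plugin being resolved for each of these guard tasks. The role source itself is not part of this log, but a task of roughly the following shape would produce the queue/evaluate/skip sequence recorded here (the name, message, and condition below are illustrative placeholders, not the role's actual code):

```yaml
# Illustrative sketch only -- the real tasks/main.yml is not included in this log.
# Structure matches the "Loading ActionModule 'fail'" and
# "Evaluated conditional ... False -> skipping" lines above.
- name: Abort applying the network state configuration when unsupported  # hypothetical
  fail:
    msg: This combination of provider and configuration is not supported  # hypothetical
  when: ansible_distribution in ['CentOS', 'RedHat'] and
        ansible_distribution_major_version | int < 9
```

Because the `when` clause fails on this host, the `fail` module body is never executed; only the conditional itself is evaluated.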
13040 1726882405.90546: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13040 1726882405.90686: in run() - task 0e448fcc-3ce9-b123-314b-00000000002a 13040 1726882405.90703: variable 'ansible_search_path' from source: unknown 13040 1726882405.90711: variable 'ansible_search_path' from source: unknown 13040 1726882405.90753: calling self._execute() 13040 1726882405.90837: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.90850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.90865: variable 'omit' from source: magic vars 13040 1726882405.91359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.93734: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.93810: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.94084: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.94126: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.94156: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.94239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.94275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 
1726882405.94305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882405.94354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882405.94375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882405.94512: variable 'ansible_distribution' from source: facts 13040 1726882405.94522: variable 'ansible_distribution_major_version' from source: facts 13040 1726882405.94547: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882405.94554: when evaluation is False, skipping this task 13040 1726882405.94561: _execute() done 13040 1726882405.94570: dumping result to json 13040 1726882405.94578: done dumping result, returning 13040 1726882405.94590: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-b123-314b-00000000002a] 13040 1726882405.94600: sending task result for task 0e448fcc-3ce9-b123-314b-00000000002a skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882405.94744: no more pending results, returning what we have 13040 1726882405.94749: results queue empty 13040 1726882405.94750: checking for any_errors_fatal 13040 1726882405.94759: done checking for any_errors_fatal 13040 1726882405.94760: 
checking for max_fail_percentage 13040 1726882405.94761: done checking for max_fail_percentage 13040 1726882405.94762: checking to see if all hosts have failed and the running result is not ok 13040 1726882405.94765: done checking to see if all hosts have failed 13040 1726882405.94766: getting the remaining hosts for this loop 13040 1726882405.94767: done getting the remaining hosts for this loop 13040 1726882405.94771: getting the next task for host managed_node1 13040 1726882405.94778: done getting next task for host managed_node1 13040 1726882405.94783: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13040 1726882405.94786: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882405.94799: getting variables 13040 1726882405.94801: in VariableManager get_vars() 13040 1726882405.94857: Calling all_inventory to load vars for managed_node1 13040 1726882405.94860: Calling groups_inventory to load vars for managed_node1 13040 1726882405.94863: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882405.94875: Calling all_plugins_play to load vars for managed_node1 13040 1726882405.94878: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882405.94881: Calling groups_plugins_play to load vars for managed_node1 13040 1726882405.95102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882405.95318: done with get_vars() 13040 1726882405.95329: done getting variables 13040 1726882405.95654: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000002a 13040 1726882405.95657: WORKER PROCESS EXITING 13040 1726882405.95722: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:25 -0400 (0:00:00.057) 0:00:03.434 ****** 13040 1726882405.95753: entering _queue_task() for managed_node1/dnf 13040 1726882405.95988: worker is 1 (out of 1 available) 13040 1726882405.95999: exiting _queue_task() for managed_node1/dnf 13040 1726882405.96011: done queuing things up, now waiting for results queue to drain 13040 1726882405.96012: waiting for pending results... 
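Each `TASK [...]` banner in this log carries a timing header from the stdout callback, e.g. `(0:00:00.057) 0:00:03.434`: the parenthesized value is the elapsed time for the preceding task, the second value the cumulative playbook runtime. A small sketch for extracting those numbers from a saved log (the regex is an assumption fitted to the format shown here, not an Ansible API):

```python
import re

# Matches the "(H:MM:SS.mmm) H:MM:SS.mmm" pair in a task timing header.
TIMING = re.compile(r"\((\d+):(\d+):(\d+\.\d+)\)\s+(\d+):(\d+):(\d+\.\d+)")

def parse_timing(line: str):
    """Return (task_seconds, cumulative_seconds) from a timing header line."""
    h1, m1, s1, h2, m2, s2 = TIMING.search(line).groups()
    to_sec = lambda h, m, s: int(h) * 3600 + int(m) * 60 + float(s)
    return to_sec(h1, m1, s1), to_sec(h2, m2, s2)

task, total = parse_timing(
    "Friday 20 September 2024 21:33:25 -0400 (0:00:00.057) 0:00:03.434 ******"
)
print(task, total)  # 0.057 3.434
```

Summing the per-task values over a full log is a quick way to see which guard evaluations and module runs dominate wall-clock time.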
13040 1726882405.96268: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13040 1726882405.96396: in run() - task 0e448fcc-3ce9-b123-314b-00000000002b 13040 1726882405.96414: variable 'ansible_search_path' from source: unknown 13040 1726882405.96421: variable 'ansible_search_path' from source: unknown 13040 1726882405.96466: calling self._execute() 13040 1726882405.96549: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882405.96565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882405.96583: variable 'omit' from source: magic vars 13040 1726882405.96995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882405.99666: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882405.99741: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882405.99783: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882405.99823: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882405.99852: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882405.99942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882405.99977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882406.00007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882406.00055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882406.00076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882406.00212: variable 'ansible_distribution' from source: facts 13040 1726882406.00223: variable 'ansible_distribution_major_version' from source: facts 13040 1726882406.00243: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882406.00250: when evaluation is False, skipping this task 13040 1726882406.00261: _execute() done 13040 1726882406.00270: dumping result to json 13040 1726882406.00277: done dumping result, returning 13040 1726882406.00289: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-00000000002b] 13040 1726882406.00299: sending task result for task 0e448fcc-3ce9-b123-314b-00000000002b skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882406.00450: no more pending results, returning what we have 13040 1726882406.00454: results queue empty 13040 1726882406.00455: checking for any_errors_fatal 13040 1726882406.00460: done checking 
for any_errors_fatal 13040 1726882406.00460: checking for max_fail_percentage 13040 1726882406.00463: done checking for max_fail_percentage 13040 1726882406.00465: checking to see if all hosts have failed and the running result is not ok 13040 1726882406.00466: done checking to see if all hosts have failed 13040 1726882406.00466: getting the remaining hosts for this loop 13040 1726882406.00468: done getting the remaining hosts for this loop 13040 1726882406.00472: getting the next task for host managed_node1 13040 1726882406.00479: done getting next task for host managed_node1 13040 1726882406.00483: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13040 1726882406.00486: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882406.00500: getting variables 13040 1726882406.00502: in VariableManager get_vars() 13040 1726882406.00558: Calling all_inventory to load vars for managed_node1 13040 1726882406.00560: Calling groups_inventory to load vars for managed_node1 13040 1726882406.00564: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882406.00575: Calling all_plugins_play to load vars for managed_node1 13040 1726882406.00578: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882406.00581: Calling groups_plugins_play to load vars for managed_node1 13040 1726882406.00749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882406.00964: done with get_vars() 13040 1726882406.00976: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13040 1726882406.01050: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13040 1726882406.01345: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000002b 13040 1726882406.01349: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:26 -0400 (0:00:00.056) 0:00:03.490 ****** 13040 1726882406.01362: entering _queue_task() for managed_node1/yum 13040 1726882406.01366: Creating lock for yum 13040 1726882406.01608: worker is 1 (out of 1 available) 13040 1726882406.01620: exiting _queue_task() for managed_node1/yum 13040 
1726882406.01630: done queuing things up, now waiting for results queue to drain 13040 1726882406.01632: waiting for pending results... 13040 1726882406.01905: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13040 1726882406.02037: in run() - task 0e448fcc-3ce9-b123-314b-00000000002c 13040 1726882406.02055: variable 'ansible_search_path' from source: unknown 13040 1726882406.02063: variable 'ansible_search_path' from source: unknown 13040 1726882406.02105: calling self._execute() 13040 1726882406.02271: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882406.02284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882406.02300: variable 'omit' from source: magic vars 13040 1726882406.02710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882406.06233: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882406.06310: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882406.06352: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882406.06409: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882406.06439: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882406.06521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882406.06554: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882406.06586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882406.06634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882406.06651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882406.06917: variable 'ansible_distribution' from source: facts 13040 1726882406.06929: variable 'ansible_distribution_major_version' from source: facts 13040 1726882406.06951: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882406.06959: when evaluation is False, skipping this task 13040 1726882406.06967: _execute() done 13040 1726882406.06973: dumping result to json 13040 1726882406.06981: done dumping result, returning 13040 1726882406.06991: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-00000000002c] 13040 1726882406.07001: sending task result for task 0e448fcc-3ce9-b123-314b-00000000002c 13040 1726882406.07117: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000002c skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int 
< 9)", "skip_reason": "Conditional result was False" } 13040 1726882406.07239: no more pending results, returning what we have 13040 1726882406.07243: results queue empty 13040 1726882406.07244: checking for any_errors_fatal 13040 1726882406.07249: done checking for any_errors_fatal 13040 1726882406.07250: checking for max_fail_percentage 13040 1726882406.07251: done checking for max_fail_percentage 13040 1726882406.07252: checking to see if all hosts have failed and the running result is not ok 13040 1726882406.07253: done checking to see if all hosts have failed 13040 1726882406.07253: getting the remaining hosts for this loop 13040 1726882406.07255: done getting the remaining hosts for this loop 13040 1726882406.07259: getting the next task for host managed_node1 13040 1726882406.07267: done getting next task for host managed_node1 13040 1726882406.07271: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13040 1726882406.07274: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882406.07286: getting variables 13040 1726882406.07288: in VariableManager get_vars() 13040 1726882406.07335: Calling all_inventory to load vars for managed_node1 13040 1726882406.07338: Calling groups_inventory to load vars for managed_node1 13040 1726882406.07341: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882406.07351: Calling all_plugins_play to load vars for managed_node1 13040 1726882406.07354: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882406.07357: Calling groups_plugins_play to load vars for managed_node1 13040 1726882406.07510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882406.07717: done with get_vars() 13040 1726882406.07727: done getting variables 13040 1726882406.07789: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:26 -0400 (0:00:00.064) 0:00:03.555 ****** 13040 1726882406.07824: entering _queue_task() for managed_node1/fail 13040 1726882406.07842: WORKER PROCESS EXITING 13040 1726882406.08353: worker is 1 (out of 1 available) 13040 1726882406.08367: exiting _queue_task() for managed_node1/fail 13040 1726882406.08379: done queuing things up, now waiting for results queue to drain 13040 1726882406.08381: waiting for pending results... 
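Each of the skipped tasks above fails the same guard, which the log prints verbatim: `(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)`. A minimal Python sketch of that evaluation (the fact values below are assumptions for illustration; the log only records that the result was False on this host):

```python
def yum_check_needed(distribution: str, major_version: str) -> bool:
    """Mirror the role's `when` expression:
    (ansible_distribution in ['CentOS','RedHat']
     and ansible_distribution_major_version | int < 9)."""
    return distribution in ("CentOS", "RedHat") and int(major_version) < 9

# On an EL7/EL8 host the guard holds and the YUM check task would run:
print(yum_check_needed("CentOS", "8"))   # True
# On this run the gathered facts fall outside that set, so the task skips:
print(yum_check_needed("Fedora", "40"))  # False
```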
13040 1726882406.08641: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13040 1726882406.08777: in run() - task 0e448fcc-3ce9-b123-314b-00000000002d 13040 1726882406.08798: variable 'ansible_search_path' from source: unknown 13040 1726882406.08806: variable 'ansible_search_path' from source: unknown 13040 1726882406.08850: calling self._execute() 13040 1726882406.08936: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882406.08947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882406.08960: variable 'omit' from source: magic vars 13040 1726882406.09390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882406.12791: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882406.12878: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882406.12919: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882406.13071: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882406.13101: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882406.13181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882406.13214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882406.13243: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882406.13294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882406.13315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882406.13561: variable 'ansible_distribution' from source: facts 13040 1726882406.13598: variable 'ansible_distribution_major_version' from source: facts 13040 1726882406.13620: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882406.13704: when evaluation is False, skipping this task 13040 1726882406.13712: _execute() done 13040 1726882406.13719: dumping result to json 13040 1726882406.13726: done dumping result, returning 13040 1726882406.13737: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-00000000002d] 13040 1726882406.13746: sending task result for task 0e448fcc-3ce9-b123-314b-00000000002d skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882406.13901: no more pending results, returning what we have 13040 1726882406.13905: results queue empty 13040 1726882406.13906: checking for any_errors_fatal 13040 1726882406.13913: done checking for any_errors_fatal 13040 1726882406.13914: checking for max_fail_percentage 13040 
1726882406.13916: done checking for max_fail_percentage 13040 1726882406.13917: checking to see if all hosts have failed and the running result is not ok 13040 1726882406.13918: done checking to see if all hosts have failed 13040 1726882406.13919: getting the remaining hosts for this loop 13040 1726882406.13920: done getting the remaining hosts for this loop 13040 1726882406.13924: getting the next task for host managed_node1 13040 1726882406.13931: done getting next task for host managed_node1 13040 1726882406.13935: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13040 1726882406.13938: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882406.13952: getting variables 13040 1726882406.13954: in VariableManager get_vars() 13040 1726882406.14011: Calling all_inventory to load vars for managed_node1 13040 1726882406.14014: Calling groups_inventory to load vars for managed_node1 13040 1726882406.14017: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882406.14028: Calling all_plugins_play to load vars for managed_node1 13040 1726882406.14030: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882406.14033: Calling groups_plugins_play to load vars for managed_node1 13040 1726882406.14204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882406.14441: done with get_vars() 13040 1726882406.14452: done getting variables 13040 1726882406.15250: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000002d 13040 1726882406.15253: WORKER PROCESS EXITING 13040 1726882406.15291: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:26 -0400 (0:00:00.074) 0:00:03.630 ****** 13040 1726882406.15325: entering _queue_task() for managed_node1/package 13040 1726882406.15559: worker is 1 (out of 1 available) 13040 1726882406.15573: exiting _queue_task() for managed_node1/package 13040 1726882406.15586: done queuing things up, now waiting for results queue to drain 13040 1726882406.15587: waiting for pending results... 
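The "Install packages" task at `roles/network/tasks/main.yml:73` is dispatched through the generic `package` action and then skipped on the same conditional. As a hypothetical sketch of how such a guarded task is typically written (this is not the role's actual source; the package variable name is assumed):

```yaml
# Hypothetical sketch of a conditionally guarded package task.
# The task name and when-condition match the log; the body is assumed.
- name: Install packages
  package:
    name: "{{ network_packages }}"   # assumed variable name
    state: present
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9
```

List items under `when` are ANDed, so all must be true for the task to run; when the compound expression is false, Ansible reports `skipping:` with the failing expression in `false_condition`, exactly as seen in the log.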
13040 1726882406.15907: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 13040 1726882406.16044: in run() - task 0e448fcc-3ce9-b123-314b-00000000002e 13040 1726882406.16068: variable 'ansible_search_path' from source: unknown 13040 1726882406.16078: variable 'ansible_search_path' from source: unknown 13040 1726882406.16117: calling self._execute() 13040 1726882406.16204: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882406.16216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882406.16230: variable 'omit' from source: magic vars 13040 1726882406.16667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882406.19962: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882406.20040: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882406.20097: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882406.20139: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882406.20171: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882406.20253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882406.20290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882406.20320: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882406.20370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882406.20389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882406.20527: variable 'ansible_distribution' from source: facts 13040 1726882406.20537: variable 'ansible_distribution_major_version' from source: facts 13040 1726882406.20565: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882406.20573: when evaluation is False, skipping this task 13040 1726882406.20580: _execute() done 13040 1726882406.20585: dumping result to json 13040 1726882406.20592: done dumping result, returning 13040 1726882406.20602: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-b123-314b-00000000002e] 13040 1726882406.20612: sending task result for task 0e448fcc-3ce9-b123-314b-00000000002e skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882406.20770: no more pending results, returning what we have 13040 1726882406.20774: results queue empty 13040 1726882406.20775: checking for any_errors_fatal 13040 1726882406.20782: done checking for any_errors_fatal 13040 1726882406.20782: checking for max_fail_percentage 13040 1726882406.20784: done checking for max_fail_percentage 13040 1726882406.20785: 
checking to see if all hosts have failed and the running result is not ok 13040 1726882406.20786: done checking to see if all hosts have failed 13040 1726882406.20787: getting the remaining hosts for this loop 13040 1726882406.20788: done getting the remaining hosts for this loop 13040 1726882406.20792: getting the next task for host managed_node1 13040 1726882406.20798: done getting next task for host managed_node1 13040 1726882406.20802: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13040 1726882406.20805: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882406.20819: getting variables 13040 1726882406.20821: in VariableManager get_vars() 13040 1726882406.20878: Calling all_inventory to load vars for managed_node1 13040 1726882406.20881: Calling groups_inventory to load vars for managed_node1 13040 1726882406.20884: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882406.20895: Calling all_plugins_play to load vars for managed_node1 13040 1726882406.20898: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882406.20901: Calling groups_plugins_play to load vars for managed_node1 13040 1726882406.21073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882406.21281: done with get_vars() 13040 1726882406.21293: done getting variables 13040 1726882406.21352: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13040 1726882406.21485: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000002e 13040 1726882406.21489: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:26 -0400 (0:00:00.061) 0:00:03.692 ****** 13040 1726882406.21504: entering _queue_task() for managed_node1/package 13040 1726882406.21981: worker is 1 (out of 1 available) 13040 1726882406.21993: exiting _queue_task() for managed_node1/package 13040 1726882406.22005: done queuing things up, now waiting for results queue to drain 13040 1726882406.22006: waiting for pending results... 
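Earlier in this run (at the YUM check task), ansible-core logged `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` and then loaded the `dnf` action plugin. That kind of redirect is driven by plugin routing metadata; a hypothetical excerpt in the style of a collection's `meta/runtime.yml` (the real mapping lives in ansible-core's builtin routing file, and the exact nesting there may differ):

```yaml
# Illustrative plugin-routing fragment, not copied from ansible-core.
plugin_routing:
  action:
    yum:
      redirect: ansible.builtin.dnf
```

This is why the log shows a lock being created for `yum` while the ActionModule actually loaded is `.../plugins/action/dnf.py`.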
13040 1726882406.22259: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13040 1726882406.22398: in run() - task 0e448fcc-3ce9-b123-314b-00000000002f 13040 1726882406.22417: variable 'ansible_search_path' from source: unknown 13040 1726882406.22426: variable 'ansible_search_path' from source: unknown 13040 1726882406.22471: calling self._execute() 13040 1726882406.22553: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882406.22569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882406.22585: variable 'omit' from source: magic vars 13040 1726882406.23010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882406.28023: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882406.28108: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882406.28157: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882406.28202: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882406.28234: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882406.28325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882406.28369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882406.28403: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882406.28452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882406.28478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882406.28610: variable 'ansible_distribution' from source: facts 13040 1726882406.28621: variable 'ansible_distribution_major_version' from source: facts 13040 1726882406.28642: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882406.28649: when evaluation is False, skipping this task 13040 1726882406.28655: _execute() done 13040 1726882406.28661: dumping result to json 13040 1726882406.28671: done dumping result, returning 13040 1726882406.28685: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-b123-314b-00000000002f] 13040 1726882406.28694: sending task result for task 0e448fcc-3ce9-b123-314b-00000000002f skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882406.28840: no more pending results, returning what we have 13040 1726882406.28845: results queue empty 13040 1726882406.28846: checking for any_errors_fatal 13040 1726882406.28852: done checking for any_errors_fatal 13040 1726882406.28853: checking for max_fail_percentage 13040 
1726882406.28855: done checking for max_fail_percentage 13040 1726882406.28856: checking to see if all hosts have failed and the running result is not ok 13040 1726882406.28856: done checking to see if all hosts have failed 13040 1726882406.28857: getting the remaining hosts for this loop 13040 1726882406.28859: done getting the remaining hosts for this loop 13040 1726882406.28863: getting the next task for host managed_node1 13040 1726882406.28872: done getting next task for host managed_node1 13040 1726882406.28876: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13040 1726882406.28879: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882406.28893: getting variables 13040 1726882406.28894: in VariableManager get_vars() 13040 1726882406.28950: Calling all_inventory to load vars for managed_node1 13040 1726882406.28953: Calling groups_inventory to load vars for managed_node1 13040 1726882406.28956: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882406.28968: Calling all_plugins_play to load vars for managed_node1 13040 1726882406.28971: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882406.28974: Calling groups_plugins_play to load vars for managed_node1 13040 1726882406.29616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882406.29796: done with get_vars() 13040 1726882406.29807: done getting variables 13040 1726882406.30149: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13040 1726882406.30178: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000002f 13040 1726882406.30181: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:26 -0400 (0:00:00.086) 0:00:03.779 ****** 13040 1726882406.30194: entering _queue_task() for managed_node1/package 13040 1726882406.30433: worker is 1 (out of 1 available) 13040 1726882406.30444: exiting _queue_task() for managed_node1/package 13040 1726882406.30455: done queuing things up, now waiting for results queue to drain 13040 1726882406.30457: waiting for pending results... 
13040 1726882406.31302: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13040 1726882406.31823: in run() - task 0e448fcc-3ce9-b123-314b-000000000030 13040 1726882406.31843: variable 'ansible_search_path' from source: unknown 13040 1726882406.31856: variable 'ansible_search_path' from source: unknown 13040 1726882406.31901: calling self._execute() 13040 1726882406.32026: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882406.32039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882406.32056: variable 'omit' from source: magic vars 13040 1726882406.32550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882406.35046: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882406.35141: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882406.35188: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882406.35230: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882406.35261: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882406.35348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882406.35384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882406.35414: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882406.35466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882406.35486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882406.35628: variable 'ansible_distribution' from source: facts 13040 1726882406.35639: variable 'ansible_distribution_major_version' from source: facts 13040 1726882406.35670: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882406.35677: when evaluation is False, skipping this task 13040 1726882406.35685: _execute() done 13040 1726882406.35692: dumping result to json 13040 1726882406.35699: done dumping result, returning 13040 1726882406.35710: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-b123-314b-000000000030] 13040 1726882406.35719: sending task result for task 0e448fcc-3ce9-b123-314b-000000000030 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882406.35880: no more pending results, returning what we have 13040 1726882406.35884: results queue empty 13040 1726882406.35884: checking for any_errors_fatal 13040 1726882406.35893: done checking for any_errors_fatal 13040 1726882406.35894: checking for max_fail_percentage 13040 1726882406.35897: done checking for 
max_fail_percentage 13040 1726882406.35898: checking to see if all hosts have failed and the running result is not ok 13040 1726882406.35900: done checking to see if all hosts have failed 13040 1726882406.35901: getting the remaining hosts for this loop 13040 1726882406.35902: done getting the remaining hosts for this loop 13040 1726882406.35906: getting the next task for host managed_node1 13040 1726882406.35913: done getting next task for host managed_node1 13040 1726882406.35918: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13040 1726882406.35921: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882406.35935: getting variables 13040 1726882406.35937: in VariableManager get_vars() 13040 1726882406.35997: Calling all_inventory to load vars for managed_node1 13040 1726882406.36000: Calling groups_inventory to load vars for managed_node1 13040 1726882406.36003: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882406.36014: Calling all_plugins_play to load vars for managed_node1 13040 1726882406.36017: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882406.36020: Calling groups_plugins_play to load vars for managed_node1 13040 1726882406.36201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882406.36413: done with get_vars() 13040 1726882406.36426: done getting variables 13040 1726882406.36747: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000030 13040 1726882406.36750: WORKER PROCESS EXITING 13040 1726882406.36817: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:26 -0400 (0:00:00.066) 0:00:03.845 ****** 13040 1726882406.36849: entering _queue_task() for managed_node1/service 13040 1726882406.36851: Creating lock for service 13040 1726882406.37099: worker is 1 (out of 1 available) 13040 1726882406.37110: exiting _queue_task() for managed_node1/service 13040 1726882406.37121: done queuing things up, now waiting for results queue to drain 13040 1726882406.37123: waiting for pending results... 
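Every `skipping:` result in this run traces back to the same `when:` guard, recorded verbatim in the log as the `false_condition` string. Its boolean logic can be sketched in plain Python (the function name and test values here are illustrative; the log does not print the node's actual facts, only that the condition evaluated to False):

```python
def libnmstate_needed(distribution: str, major_version: str) -> bool:
    """Mirror of the condition logged above:
    (ansible_distribution in ['CentOS','RedHat']
     and ansible_distribution_major_version | int < 9)
    """
    # Jinja2's `| int` filter corresponds to int() coercion here.
    return distribution in ("CentOS", "RedHat") and int(major_version) < 9

print(libnmstate_needed("CentOS", "9"))   # False -> task skipped
print(libnmstate_needed("RedHat", "8"))   # True  -> task would run
print(libnmstate_needed("Fedora", "40"))  # False -> task skipped
```

On the managed node in this run the condition evaluates to False, so each task guarded by it is skipped without ever reaching the remote host.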
13040 1726882406.37382: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13040 1726882406.37518: in run() - task 0e448fcc-3ce9-b123-314b-000000000031 13040 1726882406.37537: variable 'ansible_search_path' from source: unknown 13040 1726882406.37546: variable 'ansible_search_path' from source: unknown 13040 1726882406.37590: calling self._execute() 13040 1726882406.37678: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882406.37688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882406.37703: variable 'omit' from source: magic vars 13040 1726882406.38131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882406.40581: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882406.40656: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882406.40702: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882406.40739: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882406.40769: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882406.40848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882406.40883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882406.40916: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882406.40960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882406.40982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882406.41119: variable 'ansible_distribution' from source: facts 13040 1726882406.41134: variable 'ansible_distribution_major_version' from source: facts 13040 1726882406.41155: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882406.41162: when evaluation is False, skipping this task 13040 1726882406.41172: _execute() done 13040 1726882406.41178: dumping result to json 13040 1726882406.41185: done dumping result, returning 13040 1726882406.41195: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000031] 13040 1726882406.41204: sending task result for task 0e448fcc-3ce9-b123-314b-000000000031 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882406.41348: no more pending results, returning what we have 13040 1726882406.41352: results queue empty 13040 1726882406.41353: checking for any_errors_fatal 13040 1726882406.41361: done checking for any_errors_fatal 13040 1726882406.41362: checking for max_fail_percentage 13040 1726882406.41365: done checking for 
max_fail_percentage 13040 1726882406.41366: checking to see if all hosts have failed and the running result is not ok 13040 1726882406.41367: done checking to see if all hosts have failed 13040 1726882406.41368: getting the remaining hosts for this loop 13040 1726882406.41369: done getting the remaining hosts for this loop 13040 1726882406.41373: getting the next task for host managed_node1 13040 1726882406.41379: done getting next task for host managed_node1 13040 1726882406.41384: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13040 1726882406.41387: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882406.41400: getting variables 13040 1726882406.41402: in VariableManager get_vars() 13040 1726882406.41459: Calling all_inventory to load vars for managed_node1 13040 1726882406.41464: Calling groups_inventory to load vars for managed_node1 13040 1726882406.41467: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882406.41478: Calling all_plugins_play to load vars for managed_node1 13040 1726882406.41481: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882406.41484: Calling groups_plugins_play to load vars for managed_node1 13040 1726882406.41707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882406.41922: done with get_vars() 13040 1726882406.41932: done getting variables 13040 1726882406.42242: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000031 13040 1726882406.42245: WORKER PROCESS EXITING 13040 1726882406.42273: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:26 -0400 (0:00:00.054) 0:00:03.900 ****** 13040 1726882406.42303: entering _queue_task() for managed_node1/service 13040 1726882406.42532: worker is 1 (out of 1 available) 13040 1726882406.42545: exiting _queue_task() for managed_node1/service 13040 1726882406.42557: done queuing things up, now waiting for results queue to drain 13040 1726882406.42558: waiting for pending results... 
13040 1726882406.42812: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13040 1726882406.42944: in run() - task 0e448fcc-3ce9-b123-314b-000000000032 13040 1726882406.42965: variable 'ansible_search_path' from source: unknown 13040 1726882406.42973: variable 'ansible_search_path' from source: unknown 13040 1726882406.43014: calling self._execute() 13040 1726882406.43099: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882406.43111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882406.43126: variable 'omit' from source: magic vars 13040 1726882406.43547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882406.46034: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882406.46117: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882406.46162: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882406.46202: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882406.46235: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882406.46590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882406.46622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882406.46653: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882406.46703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882406.46722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882406.46867: variable 'ansible_distribution' from source: facts 13040 1726882406.46882: variable 'ansible_distribution_major_version' from source: facts 13040 1726882406.46904: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882406.46947: when evaluation is False, skipping this task 13040 1726882406.46954: _execute() done 13040 1726882406.46961: dumping result to json 13040 1726882406.46970: done dumping result, returning 13040 1726882406.46983: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-b123-314b-000000000032] 13040 1726882406.46995: sending task result for task 0e448fcc-3ce9-b123-314b-000000000032 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13040 1726882406.47157: no more pending results, returning what we have 13040 1726882406.47161: results queue empty 13040 1726882406.47162: checking for any_errors_fatal 13040 1726882406.47170: done checking for any_errors_fatal 13040 1726882406.47170: checking for max_fail_percentage 13040 1726882406.47172: done checking for max_fail_percentage 13040 1726882406.47173: checking to see if all hosts have failed 
and the running result is not ok 13040 1726882406.47174: done checking to see if all hosts have failed 13040 1726882406.47175: getting the remaining hosts for this loop 13040 1726882406.47177: done getting the remaining hosts for this loop 13040 1726882406.47180: getting the next task for host managed_node1 13040 1726882406.47188: done getting next task for host managed_node1 13040 1726882406.47193: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13040 1726882406.47195: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882406.47210: getting variables 13040 1726882406.47212: in VariableManager get_vars() 13040 1726882406.47269: Calling all_inventory to load vars for managed_node1 13040 1726882406.47272: Calling groups_inventory to load vars for managed_node1 13040 1726882406.47275: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882406.47285: Calling all_plugins_play to load vars for managed_node1 13040 1726882406.47288: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882406.47291: Calling groups_plugins_play to load vars for managed_node1 13040 1726882406.47470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882406.47698: done with get_vars() 13040 1726882406.47710: done getting variables 13040 1726882406.48375: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13040 1726882406.48406: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000032 13040 1726882406.48409: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:26 -0400 (0:00:00.061) 0:00:03.961 ****** 13040 1726882406.48425: entering _queue_task() for managed_node1/service 13040 1726882406.48856: worker is 1 (out of 1 available) 13040 1726882406.48867: exiting _queue_task() for managed_node1/service 13040 1726882406.48879: done queuing things up, now waiting for results queue to drain 13040 1726882406.48881: waiting for pending results... 
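The "Enable and start NetworkManager" result just above differs from the other skips: instead of `false_condition` and `skip_reason`, it prints only a `censored` placeholder plus `changed`, because the role sets `no_log: true` on that task. A minimal sketch of that censoring behavior, assuming (this is not ansible-core's actual internal function) that everything except basic bookkeeping keys is replaced by the fixed placeholder string:

```python
CENSORED = ("the output has been hidden due to the fact that "
            "'no_log: true' was specified for this result")

def strip_no_log(result: dict, no_log: bool) -> dict:
    # Hypothetical sketch: with no_log set, the callback sees only the
    # placeholder and the 'changed' flag, as in the log output above.
    if not no_log:
        return result
    return {"censored": CENSORED, "changed": result.get("changed", False)}

skipped = {"changed": False,
           "skip_reason": "Conditional result was False"}
print(strip_no_log(skipped, no_log=True))  # keys: 'censored', 'changed'
```

This is why even a harmless skip prints no reason: `no_log` censors the result unconditionally, whether the task ran, failed, or was skipped.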
13040 1726882406.49539: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13040 1726882406.49685: in run() - task 0e448fcc-3ce9-b123-314b-000000000033 13040 1726882406.49707: variable 'ansible_search_path' from source: unknown 13040 1726882406.49714: variable 'ansible_search_path' from source: unknown 13040 1726882406.49754: calling self._execute() 13040 1726882406.49844: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882406.49854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882406.49871: variable 'omit' from source: magic vars 13040 1726882406.50310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882406.54222: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882406.54300: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882406.54340: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882406.54381: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882406.54416: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882406.54496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882406.54530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882406.54559: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882406.54611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882406.54629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882406.54800: variable 'ansible_distribution' from source: facts 13040 1726882406.54811: variable 'ansible_distribution_major_version' from source: facts 13040 1726882406.54836: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882406.54842: when evaluation is False, skipping this task 13040 1726882406.54848: _execute() done 13040 1726882406.54853: dumping result to json 13040 1726882406.54860: done dumping result, returning 13040 1726882406.54874: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-b123-314b-000000000033] 13040 1726882406.54885: sending task result for task 0e448fcc-3ce9-b123-314b-000000000033 13040 1726882406.55015: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000033 13040 1726882406.55022: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882406.55097: no more pending results, returning what we have 13040 1726882406.55102: results queue empty 13040 1726882406.55102: checking for any_errors_fatal 13040 1726882406.55108: done checking for 
any_errors_fatal 13040 1726882406.55109: checking for max_fail_percentage 13040 1726882406.55111: done checking for max_fail_percentage 13040 1726882406.55112: checking to see if all hosts have failed and the running result is not ok 13040 1726882406.55113: done checking to see if all hosts have failed 13040 1726882406.55114: getting the remaining hosts for this loop 13040 1726882406.55115: done getting the remaining hosts for this loop 13040 1726882406.55119: getting the next task for host managed_node1 13040 1726882406.55158: done getting next task for host managed_node1 13040 1726882406.55180: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13040 1726882406.55183: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882406.55198: getting variables 13040 1726882406.55200: in VariableManager get_vars() 13040 1726882406.55280: Calling all_inventory to load vars for managed_node1 13040 1726882406.55283: Calling groups_inventory to load vars for managed_node1 13040 1726882406.55286: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882406.55297: Calling all_plugins_play to load vars for managed_node1 13040 1726882406.55300: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882406.55302: Calling groups_plugins_play to load vars for managed_node1 13040 1726882406.55544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882406.55757: done with get_vars() 13040 1726882406.55770: done getting variables 13040 1726882406.55828: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:26 -0400 (0:00:00.074) 0:00:04.035 ****** 13040 1726882406.55866: entering _queue_task() for managed_node1/service 13040 1726882406.56379: worker is 1 (out of 1 available) 13040 1726882406.56389: exiting _queue_task() for managed_node1/service 13040 1726882406.56402: done queuing things up, now waiting for results queue to drain 13040 1726882406.56404: waiting for pending results... 
13040 1726882406.56678: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 13040 1726882406.56802: in run() - task 0e448fcc-3ce9-b123-314b-000000000034 13040 1726882406.56818: variable 'ansible_search_path' from source: unknown 13040 1726882406.56824: variable 'ansible_search_path' from source: unknown 13040 1726882406.56896: calling self._execute() 13040 1726882406.57024: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882406.57033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882406.57046: variable 'omit' from source: magic vars 13040 1726882406.57593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882406.60195: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882406.60285: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882406.60331: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882406.60373: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882406.60405: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882406.60496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882406.60534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882406.60572: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882406.60622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882406.60642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882406.60867: variable 'ansible_distribution' from source: facts 13040 1726882406.60894: variable 'ansible_distribution_major_version' from source: facts 13040 1726882406.60915: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882406.60921: when evaluation is False, skipping this task 13040 1726882406.60927: _execute() done 13040 1726882406.60932: dumping result to json 13040 1726882406.60938: done dumping result, returning 13040 1726882406.61219: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-b123-314b-000000000034] 13040 1726882406.61231: sending task result for task 0e448fcc-3ce9-b123-314b-000000000034 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13040 1726882406.61393: no more pending results, returning what we have 13040 1726882406.61397: results queue empty 13040 1726882406.61398: checking for any_errors_fatal 13040 1726882406.61406: done checking for any_errors_fatal 13040 1726882406.61407: checking for max_fail_percentage 13040 1726882406.61409: done checking for max_fail_percentage 13040 1726882406.61410: checking to see if all hosts have failed and the 
running result is not ok 13040 1726882406.61411: done checking to see if all hosts have failed 13040 1726882406.61412: getting the remaining hosts for this loop 13040 1726882406.61413: done getting the remaining hosts for this loop 13040 1726882406.61418: getting the next task for host managed_node1 13040 1726882406.61425: done getting next task for host managed_node1 13040 1726882406.61430: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13040 1726882406.61433: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882406.61448: getting variables 13040 1726882406.61450: in VariableManager get_vars() 13040 1726882406.61516: Calling all_inventory to load vars for managed_node1 13040 1726882406.61520: Calling groups_inventory to load vars for managed_node1 13040 1726882406.61523: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882406.61535: Calling all_plugins_play to load vars for managed_node1 13040 1726882406.61538: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882406.61541: Calling groups_plugins_play to load vars for managed_node1 13040 1726882406.61732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882406.61994: done with get_vars() 13040 1726882406.62007: done getting variables 13040 1726882406.62069: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:26 -0400 (0:00:00.062) 0:00:04.098 ****** 13040 1726882406.62105: entering _queue_task() for managed_node1/copy 13040 1726882406.62125: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000034 13040 1726882406.62134: WORKER PROCESS EXITING 13040 1726882406.62621: worker is 1 (out of 1 available) 13040 1726882406.62631: exiting _queue_task() for managed_node1/copy 13040 1726882406.62642: done queuing things up, now waiting for results queue to drain 13040 1726882406.62643: waiting for pending results... 
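Each `TASK [...]` banner in this log ends with two timers: the previous task's duration in parentheses and the cumulative playbook runtime, e.g. `(0:00:00.062) 0:00:04.098`. The duration column is, to within display rounding, just the difference between consecutive cumulative values (the values below are copied from the banners above; the ±0.001 s disagreement with the printed durations is rounding, since each column is rounded independently):

```python
# Cumulative runtimes from the five TASK banners above (seconds)
cumulative = [3.845, 3.900, 3.961, 4.035, 4.098]

# Differences between consecutive cumulative values approximate the
# per-task durations printed in parentheses ([0.054, 0.061, 0.074, 0.062]).
deltas = [round(b - a, 3) for a, b in zip(cumulative, cumulative[1:])]
print(deltas)
```

Reading the timers this way makes it easy to spot which task dominates runtime in a long `-vvvv` log without any extra profiling callback.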
13040 1726882406.63232: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
13040 1726882406.63394: in run() - task 0e448fcc-3ce9-b123-314b-000000000035
13040 1726882406.63538: variable 'ansible_search_path' from source: unknown
13040 1726882406.63547: variable 'ansible_search_path' from source: unknown
13040 1726882406.63591: calling self._execute()
13040 1726882406.63738: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882406.63843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882406.63890: variable 'omit' from source: magic vars
13040 1726882406.64593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882406.67955: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882406.68029: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882406.68078: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882406.68123: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882406.68159: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882406.68264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882406.68306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882406.68340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882406.68391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882406.68420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882406.68560: variable 'ansible_distribution' from source: facts
13040 1726882406.68573: variable 'ansible_distribution_major_version' from source: facts
13040 1726882406.68597: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882406.68604: when evaluation is False, skipping this task
13040 1726882406.68611: _execute() done
13040 1726882406.68617: dumping result to json
13040 1726882406.68623: done dumping result, returning
13040 1726882406.68634: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-b123-314b-000000000035]
13040 1726882406.68644: sending task result for task 0e448fcc-3ce9-b123-314b-000000000035
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882406.68799: no more pending results, returning what we have
13040 1726882406.68803: results queue empty
13040 1726882406.68804: checking for any_errors_fatal
13040 1726882406.68810: done checking for any_errors_fatal
13040 1726882406.68811: checking for max_fail_percentage
13040 1726882406.68813: done checking for max_fail_percentage
13040 1726882406.68814: checking to see if all hosts have failed and the running result is not ok
13040 1726882406.68815: done checking to see if all hosts have failed
13040 1726882406.68816: getting the remaining hosts for this loop
13040 1726882406.68817: done getting the remaining hosts for this loop
13040 1726882406.68821: getting the next task for host managed_node1
13040 1726882406.68828: done getting next task for host managed_node1
13040 1726882406.68832: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
13040 1726882406.68835: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882406.68849: getting variables
13040 1726882406.68851: in VariableManager get_vars()
13040 1726882406.68910: Calling all_inventory to load vars for managed_node1
13040 1726882406.68913: Calling groups_inventory to load vars for managed_node1
13040 1726882406.68916: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882406.68926: Calling all_plugins_play to load vars for managed_node1
13040 1726882406.68930: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882406.68933: Calling groups_plugins_play to load vars for managed_node1
13040 1726882406.69172: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000035
13040 1726882406.69175: WORKER PROCESS EXITING
13040 1726882406.69187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882406.69396: done with get_vars()
13040 1726882406.69406: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 21:33:26 -0400 (0:00:00.073) 0:00:04.172 ******
13040 1726882406.69483: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
13040 1726882406.69484: Creating lock for fedora.linux_system_roles.network_connections
13040 1726882406.69674: worker is 1 (out of 1 available)
13040 1726882406.69688: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
13040 1726882406.69699: done queuing things up, now waiting for results queue to drain
13040 1726882406.69701: waiting for pending results...
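The skip above comes from a `when:` guard on the initscripts dependency task. As a point of reference, a minimal sketch of the kind of task that produces this log output is shown below; only the task name and the `when:` expression are taken verbatim from the log, while the `copy` arguments are hypothetical placeholders, not the role's actual source.

```yaml
# Hypothetical sketch, not the actual role source.
# Only `name:` and `when:` are taken from the log above.
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network        # placeholder destination (assumption)
    content: "# Created by network role" # placeholder content (assumption)
    mode: "0644"
  # On this Fedora host the condition evaluates to False, so the task is
  # skipped with skip_reason "Conditional result was False".
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9
```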
13040 1726882406.69886: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
13040 1726882406.69983: in run() - task 0e448fcc-3ce9-b123-314b-000000000036
13040 1726882406.70009: variable 'ansible_search_path' from source: unknown
13040 1726882406.70017: variable 'ansible_search_path' from source: unknown
13040 1726882406.70056: calling self._execute()
13040 1726882406.70145: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882406.70156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882406.70174: variable 'omit' from source: magic vars
13040 1726882406.70628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882406.72841: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882406.72896: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882406.72925: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882406.72950: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882406.72982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882406.73039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882406.73082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882406.73112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882406.73158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882406.73181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882406.73318: variable 'ansible_distribution' from source: facts
13040 1726882406.73328: variable 'ansible_distribution_major_version' from source: facts
13040 1726882406.73349: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882406.73360: when evaluation is False, skipping this task
13040 1726882406.73370: _execute() done
13040 1726882406.73377: dumping result to json
13040 1726882406.73384: done dumping result, returning
13040 1726882406.73395: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-b123-314b-000000000036]
13040 1726882406.73405: sending task result for task 0e448fcc-3ce9-b123-314b-000000000036
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882406.73560: no more pending results, returning what we have
13040 1726882406.73566: results queue empty
13040 1726882406.73567: checking for any_errors_fatal
13040 1726882406.73573: done checking for any_errors_fatal
13040 1726882406.73574: checking for max_fail_percentage
13040 1726882406.73576: done checking for max_fail_percentage
13040 1726882406.73576: checking to see if all hosts have failed and the running result is not ok
13040 1726882406.73577: done checking to see if all hosts have failed
13040 1726882406.73578: getting the remaining hosts for this loop
13040 1726882406.73579: done getting the remaining hosts for this loop
13040 1726882406.73584: getting the next task for host managed_node1
13040 1726882406.73590: done getting next task for host managed_node1
13040 1726882406.73593: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
13040 1726882406.73596: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882406.73612: getting variables
13040 1726882406.73613: in VariableManager get_vars()
13040 1726882406.73668: Calling all_inventory to load vars for managed_node1
13040 1726882406.73671: Calling groups_inventory to load vars for managed_node1
13040 1726882406.73673: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882406.73684: Calling all_plugins_play to load vars for managed_node1
13040 1726882406.73686: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882406.73690: Calling groups_plugins_play to load vars for managed_node1
13040 1726882406.73847: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000036
13040 1726882406.73851: WORKER PROCESS EXITING
13040 1726882406.73882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882406.74107: done with get_vars()
13040 1726882406.74118: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 21:33:26 -0400 (0:00:00.047) 0:00:04.219 ******
13040 1726882406.74212: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state
13040 1726882406.74214: Creating lock for fedora.linux_system_roles.network_state
13040 1726882406.74493: worker is 1 (out of 1 available)
13040 1726882406.74504: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state
13040 1726882406.74517: done queuing things up, now waiting for results queue to drain
13040 1726882406.74518: waiting for pending results...
13040 1726882406.74810: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state
13040 1726882406.74949: in run() - task 0e448fcc-3ce9-b123-314b-000000000037
13040 1726882406.74985: variable 'ansible_search_path' from source: unknown
13040 1726882406.74992: variable 'ansible_search_path' from source: unknown
13040 1726882406.75032: calling self._execute()
13040 1726882406.75132: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882406.75144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882406.75158: variable 'omit' from source: magic vars
13040 1726882406.75635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882406.78468: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882406.78544: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882406.78600: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882406.78640: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882406.78680: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882406.78760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882406.78806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882406.78842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882406.78928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882406.78949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882406.79102: variable 'ansible_distribution' from source: facts
13040 1726882406.79119: variable 'ansible_distribution_major_version' from source: facts
13040 1726882406.79145: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882406.79152: when evaluation is False, skipping this task
13040 1726882406.79159: _execute() done
13040 1726882406.79167: dumping result to json
13040 1726882406.79175: done dumping result, returning
13040 1726882406.79188: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-b123-314b-000000000037]
13040 1726882406.79198: sending task result for task 0e448fcc-3ce9-b123-314b-000000000037
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882406.79359: no more pending results, returning what we have
13040 1726882406.79365: results queue empty
13040 1726882406.79366: checking for any_errors_fatal
13040 1726882406.79376: done checking for any_errors_fatal
13040 1726882406.79377: checking for max_fail_percentage
13040 1726882406.79378: done checking for max_fail_percentage
13040 1726882406.79379: checking to see if all hosts have failed and the running result is not ok
13040 1726882406.79380: done checking to see if all hosts have failed
13040 1726882406.79381: getting the remaining hosts for this loop
13040 1726882406.79383: done getting the remaining hosts for this loop
13040 1726882406.79386: getting the next task for host managed_node1
13040 1726882406.79393: done getting next task for host managed_node1
13040 1726882406.79397: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
13040 1726882406.79400: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882406.79415: getting variables
13040 1726882406.79417: in VariableManager get_vars()
13040 1726882406.79475: Calling all_inventory to load vars for managed_node1
13040 1726882406.79479: Calling groups_inventory to load vars for managed_node1
13040 1726882406.79481: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882406.79492: Calling all_plugins_play to load vars for managed_node1
13040 1726882406.79495: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882406.79498: Calling groups_plugins_play to load vars for managed_node1
13040 1726882406.79737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882406.79965: done with get_vars()
13040 1726882406.79977: done getting variables
13040 1726882406.80126: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000037
13040 1726882406.80129: WORKER PROCESS EXITING
13040 1726882406.80166: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 21:33:26 -0400 (0:00:00.059) 0:00:04.279 ******
13040 1726882406.80202: entering _queue_task() for managed_node1/debug
13040 1726882406.80855: worker is 1 (out of 1 available)
13040 1726882406.80869: exiting _queue_task() for managed_node1/debug
13040 1726882406.80891: done queuing things up, now waiting for results queue to drain
13040 1726882406.80893: waiting for pending results...
13040 1726882406.81180: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
13040 1726882406.81330: in run() - task 0e448fcc-3ce9-b123-314b-000000000038
13040 1726882406.81353: variable 'ansible_search_path' from source: unknown
13040 1726882406.81362: variable 'ansible_search_path' from source: unknown
13040 1726882406.81405: calling self._execute()
13040 1726882406.81497: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882406.81508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882406.81522: variable 'omit' from source: magic vars
13040 1726882406.82017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882406.84635: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882406.84737: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882406.84781: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882406.84827: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882406.84863: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882406.84958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882406.84994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882406.85034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882406.85087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882406.85107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882406.85275: variable 'ansible_distribution' from source: facts
13040 1726882406.85286: variable 'ansible_distribution_major_version' from source: facts
13040 1726882406.85309: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882406.85316: when evaluation is False, skipping this task
13040 1726882406.85323: _execute() done
13040 1726882406.85329: dumping result to json
13040 1726882406.85341: done dumping result, returning
13040 1726882406.85358: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-b123-314b-000000000038]
13040 1726882406.85372: sending task result for task 0e448fcc-3ce9-b123-314b-000000000038
skipping: [managed_node1] => {
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)"
}
13040 1726882406.85531: no more pending results, returning what we have
13040 1726882406.85536: results queue empty
13040 1726882406.85537: checking for any_errors_fatal
13040 1726882406.85542: done checking for any_errors_fatal
13040 1726882406.85543: checking for max_fail_percentage
13040 1726882406.85544: done checking for max_fail_percentage
13040 1726882406.85545: checking to see if all hosts have failed and the running result is not ok
13040 1726882406.85546: done checking to see if all hosts have failed
13040 1726882406.85547: getting the remaining hosts for this loop
13040 1726882406.85548: done getting the remaining hosts for this loop
13040 1726882406.85552: getting the next task for host managed_node1
13040 1726882406.85559: done getting next task for host managed_node1
13040 1726882406.85565: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
13040 1726882406.85568: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882406.85583: getting variables
13040 1726882406.85585: in VariableManager get_vars()
13040 1726882406.85643: Calling all_inventory to load vars for managed_node1
13040 1726882406.85646: Calling groups_inventory to load vars for managed_node1
13040 1726882406.85672: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882406.85684: Calling all_plugins_play to load vars for managed_node1
13040 1726882406.85687: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882406.85690: Calling groups_plugins_play to load vars for managed_node1
13040 1726882406.85876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882406.86110: done with get_vars()
13040 1726882406.86123: done getting variables
13040 1726882406.86185: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
13040 1726882406.86329: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000038
13040 1726882406.86333: WORKER PROCESS EXITING

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 21:33:26 -0400 (0:00:00.061) 0:00:04.340 ******
13040 1726882406.86349: entering _queue_task() for managed_node1/debug
13040 1726882406.86808: worker is 1 (out of 1 available)
13040 1726882406.86938: exiting _queue_task() for managed_node1/debug
13040 1726882406.86951: done queuing things up, now waiting for results queue to drain
13040 1726882406.86952: waiting for pending results...
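The "Show stderr/debug messages" tasks skipped above are plain `debug` actions behind the same distro guard. A minimal sketch of what such a task could look like follows; the task name, module, and `when:` expression match the log, but the variable being printed is an assumption made for illustration, not taken from the role's source.

```yaml
# Hypothetical sketch; the `var:` name below is an assumption.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr  # assumed result variable
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9
```

Note that for a skipped `debug` task the result JSON in the log carries only `false_condition`, with no `changed` or `skip_reason` fields, unlike the skipped `copy` and module tasks earlier in the run.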
13040 1726882406.87226: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
13040 1726882406.87366: in run() - task 0e448fcc-3ce9-b123-314b-000000000039
13040 1726882406.87392: variable 'ansible_search_path' from source: unknown
13040 1726882406.87402: variable 'ansible_search_path' from source: unknown
13040 1726882406.87444: calling self._execute()
13040 1726882406.87541: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882406.87552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882406.87569: variable 'omit' from source: magic vars
13040 1726882406.88016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882406.90644: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882406.90721: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882406.90772: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882406.90810: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882406.90841: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882406.90926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882406.90952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882406.90983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882406.91024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882406.91038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882406.91182: variable 'ansible_distribution' from source: facts
13040 1726882406.91187: variable 'ansible_distribution_major_version' from source: facts
13040 1726882406.91205: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882406.91208: when evaluation is False, skipping this task
13040 1726882406.91211: _execute() done
13040 1726882406.91213: dumping result to json
13040 1726882406.91215: done dumping result, returning
13040 1726882406.91224: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-b123-314b-000000000039]
13040 1726882406.91230: sending task result for task 0e448fcc-3ce9-b123-314b-000000000039
13040 1726882406.91318: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000039
13040 1726882406.91321: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)"
}
13040 1726882406.91369: no more pending results, returning what we have
13040 1726882406.91373: results queue empty
13040 1726882406.91374: checking for any_errors_fatal
13040 1726882406.91381: done checking for any_errors_fatal
13040 1726882406.91382: checking for max_fail_percentage
13040 1726882406.91383: done checking for max_fail_percentage
13040 1726882406.91384: checking to see if all hosts have failed and the running result is not ok
13040 1726882406.91385: done checking to see if all hosts have failed
13040 1726882406.91386: getting the remaining hosts for this loop
13040 1726882406.91387: done getting the remaining hosts for this loop
13040 1726882406.91391: getting the next task for host managed_node1
13040 1726882406.91397: done getting next task for host managed_node1
13040 1726882406.91400: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
13040 1726882406.91403: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882406.91414: getting variables
13040 1726882406.91416: in VariableManager get_vars()
13040 1726882406.91470: Calling all_inventory to load vars for managed_node1
13040 1726882406.91472: Calling groups_inventory to load vars for managed_node1
13040 1726882406.91475: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882406.91483: Calling all_plugins_play to load vars for managed_node1
13040 1726882406.91485: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882406.91488: Calling groups_plugins_play to load vars for managed_node1
13040 1726882406.91749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882406.91960: done with get_vars()
13040 1726882406.91972: done getting variables
13040 1726882406.92034: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:33:26 -0400 (0:00:00.057) 0:00:04.397 ******
13040 1726882406.92067: entering _queue_task() for managed_node1/debug
13040 1726882406.92322: worker is 1 (out of 1 available)
13040 1726882406.92332: exiting _queue_task() for managed_node1/debug
13040 1726882406.92344: done queuing things up, now waiting for results queue to drain
13040 1726882406.92345: waiting for pending results...
13040 1726882406.93057: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
13040 1726882406.93212: in run() - task 0e448fcc-3ce9-b123-314b-00000000003a
13040 1726882406.93234: variable 'ansible_search_path' from source: unknown
13040 1726882406.93246: variable 'ansible_search_path' from source: unknown
13040 1726882406.93291: calling self._execute()
13040 1726882406.93385: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882406.93401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882406.93415: variable 'omit' from source: magic vars
13040 1726882406.93874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882406.96321: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882406.96409: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882406.96458: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882406.96502: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882406.96536: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882406.96621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882406.96656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882406.96693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882406.96737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882406.96756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882406.96903: variable 'ansible_distribution' from source: facts
13040 1726882406.96916: variable 'ansible_distribution_major_version' from source: facts
13040 1726882406.96940: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882406.96948: when evaluation is False, skipping this task
13040 1726882406.96955: _execute() done
13040 1726882406.96962: dumping result to json
13040 1726882406.96971: done dumping result, returning
13040 1726882406.96984: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-b123-314b-00000000003a]
13040 1726882406.97000: sending task result for task 0e448fcc-3ce9-b123-314b-00000000003a
13040 1726882406.97118: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000003a
skipping: [managed_node1] => {
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)"
}
13040 1726882406.97169: no more pending results, returning what we have
13040 1726882406.97173: results queue empty
13040 1726882406.97174: checking for any_errors_fatal
13040 1726882406.97180: done checking for any_errors_fatal
13040 1726882406.97181: checking for max_fail_percentage
13040 1726882406.97183: done checking for max_fail_percentage
13040 1726882406.97184: checking to see if all hosts have failed and the running result is not ok
13040 1726882406.97185: done checking to see if all hosts have failed
13040 1726882406.97185: getting the remaining hosts for this loop
13040 1726882406.97187: done getting the remaining hosts for this loop
13040 1726882406.97191: getting the next task for host managed_node1
13040 1726882406.97198: done getting next task for host managed_node1
13040 1726882406.97202: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
13040 1726882406.97205: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882406.97221: getting variables
13040 1726882406.97223: in VariableManager get_vars()
13040 1726882406.97285: Calling all_inventory to load vars for managed_node1
13040 1726882406.97289: Calling groups_inventory to load vars for managed_node1
13040 1726882406.97291: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882406.97302: Calling all_plugins_play to load vars for managed_node1
13040 1726882406.97305: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882406.97308: Calling groups_plugins_play to load vars for managed_node1
13040 1726882406.97490: WORKER PROCESS EXITING
13040 1726882406.97510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882406.97708: done with get_vars()
13040 1726882406.97717: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:33:26 -0400 (0:00:00.057) 0:00:04.455 ******
13040 1726882406.97784: entering _queue_task() for managed_node1/ping
13040 1726882406.97785: Creating lock for ping
13040 1726882406.97980: worker is 1 (out of 1 available)
13040 1726882406.97993: exiting _queue_task() for managed_node1/ping
13040 1726882406.98006: done queuing things up, now waiting for results queue to drain
13040 1726882406.98007: waiting for pending results...
13040 1726882406.98179: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
13040 1726882406.98259: in run() - task 0e448fcc-3ce9-b123-314b-00000000003b
13040 1726882406.98273: variable 'ansible_search_path' from source: unknown
13040 1726882406.98277: variable 'ansible_search_path' from source: unknown
13040 1726882406.98303: calling self._execute()
13040 1726882406.98367: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882406.98371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882406.98379: variable 'omit' from source: magic vars
13040 1726882406.98686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882407.01210: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882407.01290: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882407.01331: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882407.01375: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882407.01405: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882407.01487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882407.01520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882407.01551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882407.01602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882407.01621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882407.01927: variable 'ansible_distribution' from source: facts
13040 1726882407.01932: variable 'ansible_distribution_major_version' from source: facts
13040 1726882407.01952: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882407.01957: when evaluation is False, skipping this task
13040 1726882407.01960: _execute() done
13040 1726882407.01963: dumping result to json
13040 1726882407.01983: done dumping result, returning
13040 1726882407.01986: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-b123-314b-00000000003b]
13040 1726882407.01999: sending task result for task 0e448fcc-3ce9-b123-314b-00000000003b
13040 1726882407.02111: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000003b
13040 1726882407.02114: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882407.02161: no more pending results, returning what we have
13040 1726882407.02168: results queue empty
13040 1726882407.02169: checking for any_errors_fatal
13040 1726882407.02176: done checking for any_errors_fatal
13040 1726882407.02177: checking for max_fail_percentage
13040 1726882407.02178: done checking for max_fail_percentage
13040 1726882407.02179: checking to see if all hosts have failed and the running result is not ok
13040 1726882407.02180: done checking to see if all hosts have failed
13040 1726882407.02181: getting the remaining hosts for this loop
13040 1726882407.02182: done getting the remaining hosts for this loop
13040 1726882407.02186: getting the next task for host managed_node1
13040 1726882407.02194: done getting next task for host managed_node1
13040 1726882407.02196: ^ task is: TASK: meta (role_complete)
13040 1726882407.02198: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882407.02212: getting variables
13040 1726882407.02214: in VariableManager get_vars()
13040 1726882407.02286: Calling all_inventory to load vars for managed_node1
13040 1726882407.02290: Calling groups_inventory to load vars for managed_node1
13040 1726882407.02292: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882407.02300: Calling all_plugins_play to load vars for managed_node1
13040 1726882407.02302: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882407.02304: Calling groups_plugins_play to load vars for managed_node1
13040 1726882407.02521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882407.02751: done with get_vars()
13040 1726882407.02771: done getting variables
13040 1726882407.02865: done queuing things up, now waiting for results queue to drain
13040 1726882407.02867: results queue empty
13040 1726882407.02868: checking for any_errors_fatal
13040 1726882407.02869: done checking for any_errors_fatal
13040 1726882407.02870: checking for max_fail_percentage
13040 1726882407.02870: done checking for max_fail_percentage
13040 1726882407.02871: checking to see if all hosts have failed and the running result is not ok
13040 1726882407.02871: done checking to see if all hosts have failed
13040 1726882407.02872: getting the remaining hosts for this loop
13040 1726882407.02872: done getting the remaining hosts for this loop
13040 1726882407.02874: getting the next task for host managed_node1
13040 1726882407.02877: done getting next task for host managed_node1
13040 1726882407.02879: ^ task is: TASK: Include the task 'get_interface_stat.yml'
13040 1726882407.02880: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882407.02881: getting variables
13040 1726882407.02882: in VariableManager get_vars()
13040 1726882407.02895: Calling all_inventory to load vars for managed_node1
13040 1726882407.02896: Calling groups_inventory to load vars for managed_node1
13040 1726882407.02897: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882407.02901: Calling all_plugins_play to load vars for managed_node1
13040 1726882407.02906: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882407.02908: Calling groups_plugins_play to load vars for managed_node1
13040 1726882407.03009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882407.03122: done with get_vars()
13040 1726882407.03133: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Friday 20 September 2024 21:33:27 -0400 (0:00:00.054) 0:00:04.509 ******
13040 1726882407.03190: entering _queue_task() for managed_node1/include_tasks
13040 1726882407.03384: worker is 1 (out of 1 available)
13040 1726882407.03397: exiting _queue_task() for managed_node1/include_tasks
13040 1726882407.03409: done queuing things up, now waiting for results queue to drain
13040 1726882407.03410: waiting for pending results...
13040 1726882407.03588: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml'
13040 1726882407.03662: in run() - task 0e448fcc-3ce9-b123-314b-00000000006e
13040 1726882407.03675: variable 'ansible_search_path' from source: unknown
13040 1726882407.03678: variable 'ansible_search_path' from source: unknown
13040 1726882407.03711: calling self._execute()
13040 1726882407.03774: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882407.03784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882407.03804: variable 'omit' from source: magic vars
13040 1726882407.04217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882407.07091: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882407.07229: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882407.07277: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882407.07316: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882407.07361: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882407.07446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882407.07499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882407.07543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882407.07607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882407.07644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882407.07815: variable 'ansible_distribution' from source: facts
13040 1726882407.07825: variable 'ansible_distribution_major_version' from source: facts
13040 1726882407.07846: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882407.07853: when evaluation is False, skipping this task
13040 1726882407.07861: _execute() done
13040 1726882407.07871: dumping result to json
13040 1726882407.07883: done dumping result, returning
13040 1726882407.07905: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-b123-314b-00000000006e]
13040 1726882407.07916: sending task result for task 0e448fcc-3ce9-b123-314b-00000000006e
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882407.08072: no more pending results, returning what we have
13040 1726882407.08077: results queue empty
13040 1726882407.08078: checking for any_errors_fatal
13040 1726882407.08079: done checking for any_errors_fatal
13040 1726882407.08080: checking for max_fail_percentage
13040 1726882407.08082: done checking for max_fail_percentage
13040 1726882407.08083: checking to see if all hosts have failed and the running result is not ok
13040 1726882407.08084: done checking to see if all hosts have failed
13040 1726882407.08085: getting the remaining hosts for this loop
13040 1726882407.08086: done getting the remaining hosts for this loop
13040 1726882407.08091: getting the next task for host managed_node1
13040 1726882407.08098: done getting next task for host managed_node1
13040 1726882407.08101: ^ task is: TASK: Assert that the interface is present - '{{ interface }}'
13040 1726882407.08104: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882407.08108: getting variables
13040 1726882407.08111: in VariableManager get_vars()
13040 1726882407.08172: Calling all_inventory to load vars for managed_node1
13040 1726882407.08175: Calling groups_inventory to load vars for managed_node1
13040 1726882407.08177: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882407.08188: Calling all_plugins_play to load vars for managed_node1
13040 1726882407.08190: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882407.08195: Calling groups_plugins_play to load vars for managed_node1
13040 1726882407.08371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882407.08550: done with get_vars()
13040 1726882407.08557: done getting variables
13040 1726882407.08602: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000006e
13040 1726882407.08605: WORKER PROCESS EXITING
13040 1726882407.08617: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
13040 1726882407.08709: variable 'interface' from source: task vars
13040 1726882407.08712: variable 'controller_device' from source: play vars
13040 1726882407.08752: variable 'controller_device' from source: play vars

TASK [Assert that the interface is present - 'nm-bond'] ************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Friday 20 September 2024 21:33:27 -0400 (0:00:00.055) 0:00:04.564 ******
13040 1726882407.08776: entering _queue_task() for managed_node1/assert
13040 1726882407.08959: worker is 1 (out of 1 available)
13040 1726882407.08974: exiting _queue_task() for managed_node1/assert
13040 1726882407.08986: done queuing things up, now waiting for results queue to drain
13040 1726882407.08987: waiting for pending results...
13040 1726882407.09151: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'nm-bond'
13040 1726882407.09224: in run() - task 0e448fcc-3ce9-b123-314b-00000000006f
13040 1726882407.09235: variable 'ansible_search_path' from source: unknown
13040 1726882407.09239: variable 'ansible_search_path' from source: unknown
13040 1726882407.09273: calling self._execute()
13040 1726882407.09331: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882407.09335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882407.09343: variable 'omit' from source: magic vars
13040 1726882407.09755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882407.12104: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882407.12157: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882407.12185: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882407.12210: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882407.12229: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882407.12287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882407.12308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882407.12325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882407.12350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882407.12361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882407.12456: variable 'ansible_distribution' from source: facts
13040 1726882407.12460: variable 'ansible_distribution_major_version' from source: facts
13040 1726882407.12474: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882407.12478: when evaluation is False, skipping this task
13040 1726882407.12481: _execute() done
13040 1726882407.12483: dumping result to json
13040 1726882407.12485: done dumping result, returning
13040 1726882407.12492: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'nm-bond' [0e448fcc-3ce9-b123-314b-00000000006f]
13040 1726882407.12498: sending task result for task 0e448fcc-3ce9-b123-314b-00000000006f
13040 1726882407.12586: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000006f
13040 1726882407.12589: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882407.12642: no more pending results, returning what we have
13040 1726882407.12646: results queue empty
13040 1726882407.12646: checking for any_errors_fatal
13040 1726882407.12655: done checking for any_errors_fatal
13040 1726882407.12656: checking for max_fail_percentage
13040 1726882407.12658: done checking for max_fail_percentage
13040 1726882407.12658: checking to see if all hosts have failed and the running result is not ok
13040 1726882407.12659: done checking to see if all hosts have failed
13040 1726882407.12660: getting the remaining hosts for this loop
13040 1726882407.12662: done getting the remaining hosts for this loop
13040 1726882407.12667: getting the next task for host managed_node1
13040 1726882407.12675: done getting next task for host managed_node1
13040 1726882407.12678: ^ task is: TASK: Include the task 'assert_profile_present.yml'
13040 1726882407.12680: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882407.12683: getting variables
13040 1726882407.12684: in VariableManager get_vars()
13040 1726882407.12729: Calling all_inventory to load vars for managed_node1
13040 1726882407.12732: Calling groups_inventory to load vars for managed_node1
13040 1726882407.12734: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882407.12742: Calling all_plugins_play to load vars for managed_node1
13040 1726882407.12744: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882407.12747: Calling groups_plugins_play to load vars for managed_node1
13040 1726882407.12859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882407.12980: done with get_vars()
13040 1726882407.12987: done getting variables

TASK [Include the task 'assert_profile_present.yml'] ***************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:67
Friday 20 September 2024 21:33:27 -0400 (0:00:00.042) 0:00:04.607 ******
13040 1726882407.13046: entering _queue_task() for managed_node1/include_tasks
13040 1726882407.13224: worker is 1 (out of 1 available)
13040 1726882407.13237: exiting _queue_task() for managed_node1/include_tasks
13040 1726882407.13249: done queuing things up, now waiting for results queue to drain
13040 1726882407.13250: waiting for pending results...
13040 1726882407.13410: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml'
13040 1726882407.13472: in run() - task 0e448fcc-3ce9-b123-314b-000000000070
13040 1726882407.13483: variable 'ansible_search_path' from source: unknown
13040 1726882407.13518: variable 'controller_profile' from source: play vars
13040 1726882407.13659: variable 'controller_profile' from source: play vars
13040 1726882407.13670: variable 'port1_profile' from source: play vars
13040 1726882407.13717: variable 'port1_profile' from source: play vars
13040 1726882407.13723: variable 'port2_profile' from source: play vars
13040 1726882407.13771: variable 'port2_profile' from source: play vars
13040 1726882407.13781: variable 'omit' from source: magic vars
13040 1726882407.13870: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882407.13878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882407.13887: variable 'omit' from source: magic vars
13040 1726882407.14156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882407.15672: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882407.15716: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882407.15742: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882407.15771: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882407.15792: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882407.15855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882407.15879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882407.15897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882407.15925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882407.15936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882407.16026: variable 'ansible_distribution' from source: facts
13040 1726882407.16031: variable 'ansible_distribution_major_version' from source: facts
13040 1726882407.16044: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882407.16047: when evaluation is False, skipping this task
13040 1726882407.16070: variable 'item' from source: unknown
13040 1726882407.16116: variable 'item' from source: unknown
skipping: [managed_node1] => (item=bond0)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "item": "bond0",
    "skip_reason": "Conditional result was False"
}
13040 1726882407.16242: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882407.16245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882407.16248: variable 'omit' from source: magic vars
13040 1726882407.16352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882407.16375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882407.16393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882407.16418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882407.16428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882407.16499: variable 'ansible_distribution' from source: facts
13040 1726882407.16503: variable 'ansible_distribution_major_version' from source: facts
13040 1726882407.16509: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882407.16512: when evaluation is False, skipping this task
13040 1726882407.16528: variable 'item' from source: unknown
13040 1726882407.16577: variable 'item' from source: unknown
skipping: [managed_node1] => (item=bond0.0)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "item": "bond0.0",
    "skip_reason": "Conditional result was False"
}
13040 1726882407.16646: variable
'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.16649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.16652: variable 'omit' from source: magic vars 13040 1726882407.16770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.16792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882407.16808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.16832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.16842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.16912: variable 'ansible_distribution' from source: facts 13040 1726882407.16916: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.16923: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.16926: when evaluation is False, skipping this task 13040 1726882407.16941: variable 'item' from source: unknown 13040 1726882407.16986: variable 'item' from source: unknown skipping: [managed_node1] => (item=bond0.1) => { "ansible_loop_var": "item", "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "item": "bond0.1", "skip_reason": "Conditional result was False" } 13040 1726882407.17070: dumping result to json 13040 1726882407.17072: done dumping result, returning 13040 1726882407.17074: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' [0e448fcc-3ce9-b123-314b-000000000070] 13040 1726882407.17076: sending task result for task 0e448fcc-3ce9-b123-314b-000000000070 13040 1726882407.17119: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000070 13040 1726882407.17121: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false } MSG: All items skipped 13040 1726882407.17154: no more pending results, returning what we have 13040 1726882407.17158: results queue empty 13040 1726882407.17159: checking for any_errors_fatal 13040 1726882407.17166: done checking for any_errors_fatal 13040 1726882407.17167: checking for max_fail_percentage 13040 1726882407.17168: done checking for max_fail_percentage 13040 1726882407.17169: checking to see if all hosts have failed and the running result is not ok 13040 1726882407.17170: done checking to see if all hosts have failed 13040 1726882407.17170: getting the remaining hosts for this loop 13040 1726882407.17172: done getting the remaining hosts for this loop 13040 1726882407.17175: getting the next task for host managed_node1 13040 1726882407.17180: done getting next task for host managed_node1 13040 1726882407.17183: ^ task is: TASK: ** TEST check polling interval 13040 1726882407.17185: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.17187: getting variables 13040 1726882407.17189: in VariableManager get_vars() 13040 1726882407.17244: Calling all_inventory to load vars for managed_node1 13040 1726882407.17247: Calling groups_inventory to load vars for managed_node1 13040 1726882407.17250: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.17258: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.17261: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.17264: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.17417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.17533: done with get_vars() 13040 1726882407.17541: done getting variables 13040 1726882407.17584: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:75 Friday 20 September 2024 21:33:27 -0400 (0:00:00.045) 0:00:04.653 ****** 13040 1726882407.17603: entering _queue_task() for managed_node1/command 13040 1726882407.17786: worker is 1 (out of 1 available) 13040 1726882407.17798: exiting _queue_task() for managed_node1/command 13040 1726882407.17810: done queuing things up, now waiting for results queue to drain 13040 1726882407.17812: waiting for pending results... 
13040 1726882407.17978: running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval 13040 1726882407.18033: in run() - task 0e448fcc-3ce9-b123-314b-000000000071 13040 1726882407.18042: variable 'ansible_search_path' from source: unknown 13040 1726882407.18073: calling self._execute() 13040 1726882407.18132: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.18135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.18145: variable 'omit' from source: magic vars 13040 1726882407.18449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.20057: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.20106: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.20132: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.20161: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.20187: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.20245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.20278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882407.20297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 13040 1726882407.20323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.20334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.20433: variable 'ansible_distribution' from source: facts 13040 1726882407.20439: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.20453: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.20458: when evaluation is False, skipping this task 13040 1726882407.20461: _execute() done 13040 1726882407.20465: dumping result to json 13040 1726882407.20469: done dumping result, returning 13040 1726882407.20476: done running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval [0e448fcc-3ce9-b123-314b-000000000071] 13040 1726882407.20481: sending task result for task 0e448fcc-3ce9-b123-314b-000000000071 13040 1726882407.20570: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000071 13040 1726882407.20572: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.20623: no more pending results, returning what we have 13040 1726882407.20626: results queue empty 13040 1726882407.20627: checking for any_errors_fatal 13040 1726882407.20635: done checking for any_errors_fatal 13040 1726882407.20636: checking for max_fail_percentage 13040 1726882407.20637: done checking for max_fail_percentage 13040 1726882407.20638: checking to see if all 
hosts have failed and the running result is not ok 13040 1726882407.20639: done checking to see if all hosts have failed 13040 1726882407.20639: getting the remaining hosts for this loop 13040 1726882407.20641: done getting the remaining hosts for this loop 13040 1726882407.20645: getting the next task for host managed_node1 13040 1726882407.20650: done getting next task for host managed_node1 13040 1726882407.20655: ^ task is: TASK: ** TEST check IPv4 13040 1726882407.20656: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882407.20660: getting variables 13040 1726882407.20662: in VariableManager get_vars() 13040 1726882407.20716: Calling all_inventory to load vars for managed_node1 13040 1726882407.20719: Calling groups_inventory to load vars for managed_node1 13040 1726882407.20722: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.20730: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.20732: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.20735: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.20859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.20980: done with get_vars() 13040 1726882407.20987: done getting variables 13040 1726882407.21028: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** 
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:80 Friday 20 September 2024 21:33:27 -0400 (0:00:00.034) 0:00:04.687 ****** 13040 1726882407.21048: entering _queue_task() for managed_node1/command 13040 1726882407.21234: worker is 1 (out of 1 available) 13040 1726882407.21244: exiting _queue_task() for managed_node1/command 13040 1726882407.21258: done queuing things up, now waiting for results queue to drain 13040 1726882407.21260: waiting for pending results... 13040 1726882407.21423: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 13040 1726882407.21485: in run() - task 0e448fcc-3ce9-b123-314b-000000000072 13040 1726882407.21494: variable 'ansible_search_path' from source: unknown 13040 1726882407.21522: calling self._execute() 13040 1726882407.21580: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.21588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.21596: variable 'omit' from source: magic vars 13040 1726882407.21958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.23505: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.23550: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.23744: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.23778: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.23798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.23853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.23879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882407.23897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.23922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.23932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.24027: variable 'ansible_distribution' from source: facts 13040 1726882407.24030: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.24045: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.24047: when evaluation is False, skipping this task 13040 1726882407.24050: _execute() done 13040 1726882407.24055: dumping result to json 13040 1726882407.24057: done dumping result, returning 13040 1726882407.24060: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 [0e448fcc-3ce9-b123-314b-000000000072] 13040 1726882407.24069: sending task result for task 0e448fcc-3ce9-b123-314b-000000000072 13040 1726882407.24146: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000072 13040 1726882407.24148: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution 
in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.24204: no more pending results, returning what we have 13040 1726882407.24207: results queue empty 13040 1726882407.24208: checking for any_errors_fatal 13040 1726882407.24215: done checking for any_errors_fatal 13040 1726882407.24215: checking for max_fail_percentage 13040 1726882407.24217: done checking for max_fail_percentage 13040 1726882407.24218: checking to see if all hosts have failed and the running result is not ok 13040 1726882407.24219: done checking to see if all hosts have failed 13040 1726882407.24220: getting the remaining hosts for this loop 13040 1726882407.24221: done getting the remaining hosts for this loop 13040 1726882407.24224: getting the next task for host managed_node1 13040 1726882407.24229: done getting next task for host managed_node1 13040 1726882407.24233: ^ task is: TASK: ** TEST check IPv6 13040 1726882407.24234: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.24237: getting variables 13040 1726882407.24238: in VariableManager get_vars() 13040 1726882407.24328: Calling all_inventory to load vars for managed_node1 13040 1726882407.24331: Calling groups_inventory to load vars for managed_node1 13040 1726882407.24333: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.24340: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.24342: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.24343: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.24492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.24608: done with get_vars() 13040 1726882407.24615: done getting variables 13040 1726882407.24657: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:87 Friday 20 September 2024 21:33:27 -0400 (0:00:00.036) 0:00:04.724 ****** 13040 1726882407.24678: entering _queue_task() for managed_node1/command 13040 1726882407.24859: worker is 1 (out of 1 available) 13040 1726882407.24873: exiting _queue_task() for managed_node1/command 13040 1726882407.24885: done queuing things up, now waiting for results queue to drain 13040 1726882407.24887: waiting for pending results... 
13040 1726882407.25043: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 13040 1726882407.25101: in run() - task 0e448fcc-3ce9-b123-314b-000000000073 13040 1726882407.25111: variable 'ansible_search_path' from source: unknown 13040 1726882407.25142: calling self._execute() 13040 1726882407.25203: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.25207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.25216: variable 'omit' from source: magic vars 13040 1726882407.25614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.27912: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.27978: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.28018: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.28056: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.28089: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.28168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.28204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882407.28239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 13040 1726882407.28287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.28304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.28432: variable 'ansible_distribution' from source: facts 13040 1726882407.28444: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.28471: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.28478: when evaluation is False, skipping this task 13040 1726882407.28485: _execute() done 13040 1726882407.28490: dumping result to json 13040 1726882407.28496: done dumping result, returning 13040 1726882407.28516: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 [0e448fcc-3ce9-b123-314b-000000000073] 13040 1726882407.28532: sending task result for task 0e448fcc-3ce9-b123-314b-000000000073 13040 1726882407.28651: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000073 13040 1726882407.28660: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.28712: no more pending results, returning what we have 13040 1726882407.28716: results queue empty 13040 1726882407.28717: checking for any_errors_fatal 13040 1726882407.28722: done checking for any_errors_fatal 13040 1726882407.28723: checking for max_fail_percentage 13040 1726882407.28725: done checking for max_fail_percentage 13040 1726882407.28726: checking to see if all hosts have 
failed and the running result is not ok 13040 1726882407.28727: done checking to see if all hosts have failed 13040 1726882407.28728: getting the remaining hosts for this loop 13040 1726882407.28729: done getting the remaining hosts for this loop 13040 1726882407.28733: getting the next task for host managed_node1 13040 1726882407.28741: done getting next task for host managed_node1 13040 1726882407.28747: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13040 1726882407.28750: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.28768: getting variables 13040 1726882407.28771: in VariableManager get_vars() 13040 1726882407.28821: Calling all_inventory to load vars for managed_node1 13040 1726882407.28824: Calling groups_inventory to load vars for managed_node1 13040 1726882407.28827: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.28837: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.28840: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.28843: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.29011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.29362: done with get_vars() 13040 1726882407.29375: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:27 -0400 (0:00:00.047) 0:00:04.771 ****** 13040 1726882407.29476: entering _queue_task() for managed_node1/include_tasks 13040 1726882407.30043: worker is 1 (out of 1 available) 13040 1726882407.30090: exiting _queue_task() for managed_node1/include_tasks 13040 1726882407.30100: done queuing things up, now waiting for results queue to drain 13040 1726882407.30102: waiting for pending results... 
13040 1726882407.30398: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13040 1726882407.30537: in run() - task 0e448fcc-3ce9-b123-314b-00000000007b 13040 1726882407.30563: variable 'ansible_search_path' from source: unknown 13040 1726882407.30573: variable 'ansible_search_path' from source: unknown 13040 1726882407.30610: calling self._execute() 13040 1726882407.30697: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.30708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.30720: variable 'omit' from source: magic vars 13040 1726882407.31337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.33640: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.33741: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.33786: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.33827: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.33880: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.33967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.34002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882407.34031: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.34079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.34098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.34249: variable 'ansible_distribution' from source: facts 13040 1726882407.34262: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.34337: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.34344: when evaluation is False, skipping this task 13040 1726882407.34350: _execute() done 13040 1726882407.34355: dumping result to json 13040 1726882407.34362: done dumping result, returning 13040 1726882407.34375: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-b123-314b-00000000007b] 13040 1726882407.34385: sending task result for task 0e448fcc-3ce9-b123-314b-00000000007b 13040 1726882407.34508: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000007b 13040 1726882407.34511: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.34570: no more pending results, returning what we have 13040 1726882407.34574: results queue empty 13040 1726882407.34575: checking for any_errors_fatal 13040 1726882407.34581: done checking for 
any_errors_fatal 13040 1726882407.34581: checking for max_fail_percentage 13040 1726882407.34583: done checking for max_fail_percentage 13040 1726882407.34584: checking to see if all hosts have failed and the running result is not ok 13040 1726882407.34585: done checking to see if all hosts have failed 13040 1726882407.34586: getting the remaining hosts for this loop 13040 1726882407.34587: done getting the remaining hosts for this loop 13040 1726882407.34591: getting the next task for host managed_node1 13040 1726882407.34597: done getting next task for host managed_node1 13040 1726882407.34601: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13040 1726882407.34604: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.34619: getting variables 13040 1726882407.34621: in VariableManager get_vars() 13040 1726882407.34673: Calling all_inventory to load vars for managed_node1 13040 1726882407.34676: Calling groups_inventory to load vars for managed_node1 13040 1726882407.34678: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.34688: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.34707: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.34713: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.34876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.34998: done with get_vars() 13040 1726882407.35006: done getting variables 13040 1726882407.35049: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:27 -0400 (0:00:00.055) 0:00:04.827 ****** 13040 1726882407.35075: entering _queue_task() for managed_node1/debug 13040 1726882407.35259: worker is 1 (out of 1 available) 13040 1726882407.35274: exiting _queue_task() for managed_node1/debug 13040 1726882407.35286: done queuing things up, now waiting for results queue to drain 13040 1726882407.35287: waiting for pending results... 
13040 1726882407.35462: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 13040 1726882407.35591: in run() - task 0e448fcc-3ce9-b123-314b-00000000007c 13040 1726882407.35611: variable 'ansible_search_path' from source: unknown 13040 1726882407.35619: variable 'ansible_search_path' from source: unknown 13040 1726882407.35660: calling self._execute() 13040 1726882407.35739: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.35751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.35770: variable 'omit' from source: magic vars 13040 1726882407.36186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.40150: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.40254: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.40300: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.40337: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.40373: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.40463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.40501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882407.40533: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.40585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.40606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.40745: variable 'ansible_distribution' from source: facts 13040 1726882407.40760: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.40785: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.40793: when evaluation is False, skipping this task 13040 1726882407.40800: _execute() done 13040 1726882407.40806: dumping result to json 13040 1726882407.40813: done dumping result, returning 13040 1726882407.40825: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-b123-314b-00000000007c] 13040 1726882407.40834: sending task result for task 0e448fcc-3ce9-b123-314b-00000000007c skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882407.40987: no more pending results, returning what we have 13040 1726882407.40991: results queue empty 13040 1726882407.40992: checking for any_errors_fatal 13040 1726882407.40997: done checking for any_errors_fatal 13040 1726882407.40998: checking for max_fail_percentage 13040 1726882407.41001: done checking for max_fail_percentage 13040 1726882407.41001: checking to see if all hosts have failed and the running 
result is not ok 13040 1726882407.41002: done checking to see if all hosts have failed 13040 1726882407.41003: getting the remaining hosts for this loop 13040 1726882407.41004: done getting the remaining hosts for this loop 13040 1726882407.41008: getting the next task for host managed_node1 13040 1726882407.41013: done getting next task for host managed_node1 13040 1726882407.41018: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13040 1726882407.41021: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.41037: getting variables 13040 1726882407.41039: in VariableManager get_vars() 13040 1726882407.41090: Calling all_inventory to load vars for managed_node1 13040 1726882407.41093: Calling groups_inventory to load vars for managed_node1 13040 1726882407.41095: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.41104: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.41107: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.41109: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.41280: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000007c 13040 1726882407.41284: WORKER PROCESS EXITING 13040 1726882407.41290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.41514: done with get_vars() 13040 1726882407.41525: done getting variables 13040 1726882407.41584: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:27 -0400 (0:00:00.065) 0:00:04.893 ****** 13040 1726882407.41621: entering _queue_task() for managed_node1/fail 13040 1726882407.41881: worker is 1 (out of 1 available) 13040 1726882407.41893: exiting _queue_task() for managed_node1/fail 13040 1726882407.41904: done queuing things up, now waiting for results queue to drain 13040 1726882407.41905: waiting for pending results... 
13040 1726882407.42298: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13040 1726882407.42445: in run() - task 0e448fcc-3ce9-b123-314b-00000000007d 13040 1726882407.42474: variable 'ansible_search_path' from source: unknown 13040 1726882407.42481: variable 'ansible_search_path' from source: unknown 13040 1726882407.42521: calling self._execute() 13040 1726882407.42609: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.42623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.42636: variable 'omit' from source: magic vars 13040 1726882407.43184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.46298: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.46374: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.46422: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.46477: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.46514: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.46596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.46629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882407.46656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.46716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.46736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.46889: variable 'ansible_distribution' from source: facts 13040 1726882407.46901: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.46932: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.46939: when evaluation is False, skipping this task 13040 1726882407.46945: _execute() done 13040 1726882407.46950: dumping result to json 13040 1726882407.46957: done dumping result, returning 13040 1726882407.46970: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-b123-314b-00000000007d] 13040 1726882407.46979: sending task result for task 0e448fcc-3ce9-b123-314b-00000000007d skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.47125: no more pending results, returning what we have 13040 1726882407.47129: results queue empty 13040 1726882407.47130: checking for any_errors_fatal 13040 1726882407.47138: done checking for 
any_errors_fatal 13040 1726882407.47138: checking for max_fail_percentage 13040 1726882407.47140: done checking for max_fail_percentage 13040 1726882407.47141: checking to see if all hosts have failed and the running result is not ok 13040 1726882407.47142: done checking to see if all hosts have failed 13040 1726882407.47143: getting the remaining hosts for this loop 13040 1726882407.47144: done getting the remaining hosts for this loop 13040 1726882407.47148: getting the next task for host managed_node1 13040 1726882407.47155: done getting next task for host managed_node1 13040 1726882407.47159: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13040 1726882407.47162: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.47188: getting variables 13040 1726882407.47190: in VariableManager get_vars() 13040 1726882407.47243: Calling all_inventory to load vars for managed_node1 13040 1726882407.47246: Calling groups_inventory to load vars for managed_node1 13040 1726882407.47308: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.47320: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.47323: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.47326: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.47499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.47726: done with get_vars() 13040 1726882407.47737: done getting variables 13040 1726882407.47915: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13040 1726882407.47971: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000007d 13040 1726882407.47974: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:27 -0400 (0:00:00.063) 0:00:04.957 ****** 13040 1726882407.48003: entering _queue_task() for managed_node1/fail 13040 1726882407.48697: worker is 1 (out of 1 available) 13040 1726882407.48712: exiting _queue_task() for managed_node1/fail 13040 1726882407.48724: done queuing things up, now waiting for results queue to drain 13040 1726882407.48726: waiting for pending results... 
13040 1726882407.49758: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13040 1726882407.49936: in run() - task 0e448fcc-3ce9-b123-314b-00000000007e 13040 1726882407.49955: variable 'ansible_search_path' from source: unknown 13040 1726882407.49963: variable 'ansible_search_path' from source: unknown 13040 1726882407.50003: calling self._execute() 13040 1726882407.50096: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.50117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.50132: variable 'omit' from source: magic vars 13040 1726882407.50588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.53298: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.53366: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.53413: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.53449: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.53483: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.53565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.53599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 
1726882407.53635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.53681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.53699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.53841: variable 'ansible_distribution' from source: facts 13040 1726882407.53853: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.53880: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.53889: when evaluation is False, skipping this task 13040 1726882407.53895: _execute() done 13040 1726882407.53901: dumping result to json 13040 1726882407.53909: done dumping result, returning 13040 1726882407.53921: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-b123-314b-00000000007e] 13040 1726882407.53937: sending task result for task 0e448fcc-3ce9-b123-314b-00000000007e skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.54095: no more pending results, returning what we have 13040 1726882407.54099: results queue empty 13040 1726882407.54100: checking for any_errors_fatal 13040 1726882407.54108: done checking for any_errors_fatal 13040 
1726882407.54109: checking for max_fail_percentage 13040 1726882407.54111: done checking for max_fail_percentage 13040 1726882407.54112: checking to see if all hosts have failed and the running result is not ok 13040 1726882407.54112: done checking to see if all hosts have failed 13040 1726882407.54114: getting the remaining hosts for this loop 13040 1726882407.54115: done getting the remaining hosts for this loop 13040 1726882407.54120: getting the next task for host managed_node1 13040 1726882407.54128: done getting next task for host managed_node1 13040 1726882407.54133: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13040 1726882407.54136: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.54156: getting variables 13040 1726882407.54158: in VariableManager get_vars() 13040 1726882407.54219: Calling all_inventory to load vars for managed_node1 13040 1726882407.54222: Calling groups_inventory to load vars for managed_node1 13040 1726882407.54225: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.54235: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.54238: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.54240: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.54414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.54633: done with get_vars() 13040 1726882407.54645: done getting variables 13040 1726882407.54804: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000007e 13040 1726882407.54807: WORKER PROCESS EXITING 13040 1726882407.54835: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:27 -0400 (0:00:00.068) 0:00:05.025 ****** 13040 1726882407.54872: entering _queue_task() for managed_node1/fail 13040 1726882407.55301: worker is 1 (out of 1 available) 13040 1726882407.55323: exiting _queue_task() for managed_node1/fail 13040 1726882407.55335: done queuing things up, now waiting for results queue to drain 13040 1726882407.55337: waiting for pending results... 
13040 1726882407.55513: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13040 1726882407.55612: in run() - task 0e448fcc-3ce9-b123-314b-00000000007f 13040 1726882407.55623: variable 'ansible_search_path' from source: unknown 13040 1726882407.55626: variable 'ansible_search_path' from source: unknown 13040 1726882407.55659: calling self._execute() 13040 1726882407.55966: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.55970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.55979: variable 'omit' from source: magic vars 13040 1726882407.56259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.57918: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.58001: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.58047: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.58093: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.58124: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.58207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.58241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 
1726882407.58287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.58338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.58360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.58487: variable 'ansible_distribution' from source: facts 13040 1726882407.58491: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.58505: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.58508: when evaluation is False, skipping this task 13040 1726882407.58510: _execute() done 13040 1726882407.58513: dumping result to json 13040 1726882407.58515: done dumping result, returning 13040 1726882407.58524: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-b123-314b-00000000007f] 13040 1726882407.58530: sending task result for task 0e448fcc-3ce9-b123-314b-00000000007f 13040 1726882407.58631: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000007f 13040 1726882407.58634: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.58698: no more pending results, returning what we have 13040 1726882407.58702: results queue 
empty 13040 1726882407.58703: checking for any_errors_fatal 13040 1726882407.58710: done checking for any_errors_fatal 13040 1726882407.58711: checking for max_fail_percentage 13040 1726882407.58712: done checking for max_fail_percentage 13040 1726882407.58713: checking to see if all hosts have failed and the running result is not ok 13040 1726882407.58714: done checking to see if all hosts have failed 13040 1726882407.58714: getting the remaining hosts for this loop 13040 1726882407.58716: done getting the remaining hosts for this loop 13040 1726882407.58719: getting the next task for host managed_node1 13040 1726882407.58725: done getting next task for host managed_node1 13040 1726882407.58729: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13040 1726882407.58732: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.58747: getting variables 13040 1726882407.58749: in VariableManager get_vars() 13040 1726882407.59172: Calling all_inventory to load vars for managed_node1 13040 1726882407.59175: Calling groups_inventory to load vars for managed_node1 13040 1726882407.59177: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.59186: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.59189: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.59192: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.59337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.59537: done with get_vars() 13040 1726882407.59547: done getting variables 13040 1726882407.59603: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:27 -0400 (0:00:00.047) 0:00:05.073 ****** 13040 1726882407.59632: entering _queue_task() for managed_node1/dnf 13040 1726882407.59889: worker is 1 (out of 1 available) 13040 1726882407.59901: exiting _queue_task() for managed_node1/dnf 13040 1726882407.59932: done queuing things up, now waiting for results queue to drain 13040 1726882407.59934: waiting for pending results... 
13040 1726882407.60115: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13040 1726882407.60207: in run() - task 0e448fcc-3ce9-b123-314b-000000000080 13040 1726882407.60217: variable 'ansible_search_path' from source: unknown 13040 1726882407.60220: variable 'ansible_search_path' from source: unknown 13040 1726882407.60252: calling self._execute() 13040 1726882407.60316: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.60321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.60329: variable 'omit' from source: magic vars 13040 1726882407.60632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.62569: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.62642: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.62681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.62722: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.62747: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.62808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.62827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882407.62846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.62882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.62892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.62997: variable 'ansible_distribution' from source: facts 13040 1726882407.63001: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.63016: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.63019: when evaluation is False, skipping this task 13040 1726882407.63021: _execute() done 13040 1726882407.63024: dumping result to json 13040 1726882407.63026: done dumping result, returning 13040 1726882407.63033: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000080] 13040 1726882407.63038: sending task result for task 0e448fcc-3ce9-b123-314b-000000000080 13040 1726882407.63163: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000080 13040 1726882407.63169: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.63214: no more pending results, returning what 
we have 13040 1726882407.63217: results queue empty 13040 1726882407.63218: checking for any_errors_fatal 13040 1726882407.63229: done checking for any_errors_fatal 13040 1726882407.63230: checking for max_fail_percentage 13040 1726882407.63231: done checking for max_fail_percentage 13040 1726882407.63232: checking to see if all hosts have failed and the running result is not ok 13040 1726882407.63233: done checking to see if all hosts have failed 13040 1726882407.63234: getting the remaining hosts for this loop 13040 1726882407.63235: done getting the remaining hosts for this loop 13040 1726882407.63238: getting the next task for host managed_node1 13040 1726882407.63244: done getting next task for host managed_node1 13040 1726882407.63249: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13040 1726882407.63251: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.63269: getting variables 13040 1726882407.63271: in VariableManager get_vars() 13040 1726882407.63318: Calling all_inventory to load vars for managed_node1 13040 1726882407.63321: Calling groups_inventory to load vars for managed_node1 13040 1726882407.63323: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.63331: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.63333: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.63335: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.63448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.63604: done with get_vars() 13040 1726882407.63611: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13040 1726882407.63661: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:27 -0400 (0:00:00.040) 0:00:05.114 ****** 13040 1726882407.63685: entering _queue_task() for managed_node1/yum 13040 1726882407.63868: worker is 1 (out of 1 available) 13040 1726882407.63880: exiting _queue_task() for managed_node1/yum 13040 1726882407.63891: done queuing things up, now waiting for results queue to drain 13040 1726882407.63892: waiting for pending results... 
13040 1726882407.64062: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13040 1726882407.64148: in run() - task 0e448fcc-3ce9-b123-314b-000000000081 13040 1726882407.64161: variable 'ansible_search_path' from source: unknown 13040 1726882407.64166: variable 'ansible_search_path' from source: unknown 13040 1726882407.64200: calling self._execute() 13040 1726882407.64298: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.64308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.64320: variable 'omit' from source: magic vars 13040 1726882407.64798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.66685: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.66735: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.66764: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.66791: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.66811: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.66867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.66889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882407.66907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.66933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.66943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.67044: variable 'ansible_distribution' from source: facts 13040 1726882407.67048: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.67068: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.67071: when evaluation is False, skipping this task 13040 1726882407.67076: _execute() done 13040 1726882407.67078: dumping result to json 13040 1726882407.67081: done dumping result, returning 13040 1726882407.67084: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000081] 13040 1726882407.67089: sending task result for task 0e448fcc-3ce9-b123-314b-000000000081 13040 1726882407.67180: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000081 13040 1726882407.67183: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.67251: no more pending results, returning what 
we have 13040 1726882407.67255: results queue empty 13040 1726882407.67256: checking for any_errors_fatal 13040 1726882407.67267: done checking for any_errors_fatal 13040 1726882407.67267: checking for max_fail_percentage 13040 1726882407.67269: done checking for max_fail_percentage 13040 1726882407.67270: checking to see if all hosts have failed and the running result is not ok 13040 1726882407.67271: done checking to see if all hosts have failed 13040 1726882407.67271: getting the remaining hosts for this loop 13040 1726882407.67272: done getting the remaining hosts for this loop 13040 1726882407.67276: getting the next task for host managed_node1 13040 1726882407.67281: done getting next task for host managed_node1 13040 1726882407.67286: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13040 1726882407.67288: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.67304: getting variables 13040 1726882407.67305: in VariableManager get_vars() 13040 1726882407.67352: Calling all_inventory to load vars for managed_node1 13040 1726882407.67355: Calling groups_inventory to load vars for managed_node1 13040 1726882407.67357: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.67367: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.67369: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.67372: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.67532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.67741: done with get_vars() 13040 1726882407.67751: done getting variables 13040 1726882407.67806: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:27 -0400 (0:00:00.041) 0:00:05.155 ****** 13040 1726882407.67837: entering _queue_task() for managed_node1/fail 13040 1726882407.68063: worker is 1 (out of 1 available) 13040 1726882407.68075: exiting _queue_task() for managed_node1/fail 13040 1726882407.68087: done queuing things up, now waiting for results queue to drain 13040 1726882407.68088: waiting for pending results... 
13040 1726882407.68352: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13040 1726882407.68494: in run() - task 0e448fcc-3ce9-b123-314b-000000000082 13040 1726882407.68512: variable 'ansible_search_path' from source: unknown 13040 1726882407.68521: variable 'ansible_search_path' from source: unknown 13040 1726882407.68569: calling self._execute() 13040 1726882407.68654: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.68668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.68682: variable 'omit' from source: magic vars 13040 1726882407.69108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.71069: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.71114: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.71141: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.71169: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.71193: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.71251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.71279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882407.71301: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.71326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.71336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.71439: variable 'ansible_distribution' from source: facts 13040 1726882407.71442: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.71460: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.71464: when evaluation is False, skipping this task 13040 1726882407.71468: _execute() done 13040 1726882407.71470: dumping result to json 13040 1726882407.71472: done dumping result, returning 13040 1726882407.71479: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000082] 13040 1726882407.71485: sending task result for task 0e448fcc-3ce9-b123-314b-000000000082 13040 1726882407.71577: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000082 13040 1726882407.71580: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.71627: no more pending results, returning what we have 13040 1726882407.71631: results queue empty 13040 1726882407.71632: checking for 
any_errors_fatal 13040 1726882407.71638: done checking for any_errors_fatal 13040 1726882407.71639: checking for max_fail_percentage 13040 1726882407.71640: done checking for max_fail_percentage 13040 1726882407.71641: checking to see if all hosts have failed and the running result is not ok 13040 1726882407.71642: done checking to see if all hosts have failed 13040 1726882407.71642: getting the remaining hosts for this loop 13040 1726882407.71644: done getting the remaining hosts for this loop 13040 1726882407.71647: getting the next task for host managed_node1 13040 1726882407.71655: done getting next task for host managed_node1 13040 1726882407.71659: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13040 1726882407.71662: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.71681: getting variables 13040 1726882407.71682: in VariableManager get_vars() 13040 1726882407.71729: Calling all_inventory to load vars for managed_node1 13040 1726882407.71731: Calling groups_inventory to load vars for managed_node1 13040 1726882407.71733: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.71742: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.71744: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.71746: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.71961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.72210: done with get_vars() 13040 1726882407.72221: done getting variables 13040 1726882407.72288: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:27 -0400 (0:00:00.044) 0:00:05.200 ****** 13040 1726882407.72330: entering _queue_task() for managed_node1/package 13040 1726882407.72606: worker is 1 (out of 1 available) 13040 1726882407.72619: exiting _queue_task() for managed_node1/package 13040 1726882407.72631: done queuing things up, now waiting for results queue to drain 13040 1726882407.72632: waiting for pending results... 
13040 1726882407.73035: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 13040 1726882407.73149: in run() - task 0e448fcc-3ce9-b123-314b-000000000083 13040 1726882407.73160: variable 'ansible_search_path' from source: unknown 13040 1726882407.73166: variable 'ansible_search_path' from source: unknown 13040 1726882407.73196: calling self._execute() 13040 1726882407.73268: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.73272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.73281: variable 'omit' from source: magic vars 13040 1726882407.73585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.75381: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.75488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.75533: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.75581: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.75623: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.75712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.75748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882407.75783: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.75839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.75866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.76029: variable 'ansible_distribution' from source: facts 13040 1726882407.76045: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.76071: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.76078: when evaluation is False, skipping this task 13040 1726882407.76084: _execute() done 13040 1726882407.76090: dumping result to json 13040 1726882407.76096: done dumping result, returning 13040 1726882407.76107: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-b123-314b-000000000083] 13040 1726882407.76116: sending task result for task 0e448fcc-3ce9-b123-314b-000000000083 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.76298: no more pending results, returning what we have 13040 1726882407.76302: results queue empty 13040 1726882407.76303: checking for any_errors_fatal 13040 1726882407.76311: done checking for any_errors_fatal 13040 1726882407.76312: checking for max_fail_percentage 13040 1726882407.76314: done checking for max_fail_percentage 13040 1726882407.76315: 
checking to see if all hosts have failed and the running result is not ok 13040 1726882407.76316: done checking to see if all hosts have failed 13040 1726882407.76317: getting the remaining hosts for this loop 13040 1726882407.76318: done getting the remaining hosts for this loop 13040 1726882407.76322: getting the next task for host managed_node1 13040 1726882407.76329: done getting next task for host managed_node1 13040 1726882407.76333: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13040 1726882407.76336: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.76360: getting variables 13040 1726882407.76362: in VariableManager get_vars() 13040 1726882407.76427: Calling all_inventory to load vars for managed_node1 13040 1726882407.76431: Calling groups_inventory to load vars for managed_node1 13040 1726882407.76434: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.76445: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.76448: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.76454: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.76709: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000083 13040 1726882407.76714: WORKER PROCESS EXITING 13040 1726882407.76738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.76960: done with get_vars() 13040 1726882407.76972: done getting variables 13040 1726882407.77016: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:27 -0400 (0:00:00.047) 0:00:05.247 ****** 13040 1726882407.77043: entering _queue_task() for managed_node1/package 13040 1726882407.77233: worker is 1 (out of 1 available) 13040 1726882407.77246: exiting _queue_task() for managed_node1/package 13040 1726882407.77261: done queuing things up, now waiting for results queue to drain 13040 1726882407.77263: waiting for pending results... 
13040 1726882407.77453: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13040 1726882407.77592: in run() - task 0e448fcc-3ce9-b123-314b-000000000084 13040 1726882407.77613: variable 'ansible_search_path' from source: unknown 13040 1726882407.77627: variable 'ansible_search_path' from source: unknown 13040 1726882407.77666: calling self._execute() 13040 1726882407.77753: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.77767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.77782: variable 'omit' from source: magic vars 13040 1726882407.78205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.80586: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.80655: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.80700: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.80738: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.80771: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.80849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.80890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882407.80998: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.81044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.81067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.81744: variable 'ansible_distribution' from source: facts 13040 1726882407.81749: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.81775: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.81783: when evaluation is False, skipping this task 13040 1726882407.81791: _execute() done 13040 1726882407.81798: dumping result to json 13040 1726882407.81805: done dumping result, returning 13040 1726882407.81817: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-b123-314b-000000000084] 13040 1726882407.81827: sending task result for task 0e448fcc-3ce9-b123-314b-000000000084 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.81977: no more pending results, returning what we have 13040 1726882407.81981: results queue empty 13040 1726882407.81982: checking for any_errors_fatal 13040 1726882407.81989: done checking for any_errors_fatal 13040 1726882407.81990: checking for max_fail_percentage 13040 
1726882407.81992: done checking for max_fail_percentage 13040 1726882407.81993: checking to see if all hosts have failed and the running result is not ok 13040 1726882407.81994: done checking to see if all hosts have failed 13040 1726882407.81995: getting the remaining hosts for this loop 13040 1726882407.81996: done getting the remaining hosts for this loop 13040 1726882407.82000: getting the next task for host managed_node1 13040 1726882407.82008: done getting next task for host managed_node1 13040 1726882407.82013: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13040 1726882407.82016: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.82036: getting variables 13040 1726882407.82038: in VariableManager get_vars() 13040 1726882407.82100: Calling all_inventory to load vars for managed_node1 13040 1726882407.82103: Calling groups_inventory to load vars for managed_node1 13040 1726882407.82108: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.82118: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.82122: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.82126: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.82336: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000084 13040 1726882407.82339: WORKER PROCESS EXITING 13040 1726882407.82370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.82652: done with get_vars() 13040 1726882407.82668: done getting variables 13040 1726882407.82726: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:27 -0400 (0:00:00.057) 0:00:05.304 ****** 13040 1726882407.82760: entering _queue_task() for managed_node1/package 13040 1726882407.83012: worker is 1 (out of 1 available) 13040 1726882407.83023: exiting _queue_task() for managed_node1/package 13040 1726882407.83035: done queuing things up, now waiting for results queue to drain 13040 1726882407.83037: waiting for pending results... 
13040 1726882407.83784: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13040 1726882407.84220: in run() - task 0e448fcc-3ce9-b123-314b-000000000085 13040 1726882407.84241: variable 'ansible_search_path' from source: unknown 13040 1726882407.84249: variable 'ansible_search_path' from source: unknown 13040 1726882407.84299: calling self._execute() 13040 1726882407.84392: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.84418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.84434: variable 'omit' from source: magic vars 13040 1726882407.85175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.88049: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.88137: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.88184: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.88223: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.88257: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.88329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.88372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882407.88387: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.88414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.88425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.88524: variable 'ansible_distribution' from source: facts 13040 1726882407.88528: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.88550: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.88556: when evaluation is False, skipping this task 13040 1726882407.88559: _execute() done 13040 1726882407.88561: dumping result to json 13040 1726882407.88568: done dumping result, returning 13040 1726882407.88571: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-b123-314b-000000000085] 13040 1726882407.88573: sending task result for task 0e448fcc-3ce9-b123-314b-000000000085 13040 1726882407.88670: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000085 13040 1726882407.88672: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.88726: no more pending results, returning what we have 13040 1726882407.88730: results queue empty 13040 1726882407.88730: checking for any_errors_fatal 13040 
1726882407.88738: done checking for any_errors_fatal 13040 1726882407.88738: checking for max_fail_percentage 13040 1726882407.88740: done checking for max_fail_percentage 13040 1726882407.88741: checking to see if all hosts have failed and the running result is not ok 13040 1726882407.88741: done checking to see if all hosts have failed 13040 1726882407.88742: getting the remaining hosts for this loop 13040 1726882407.88743: done getting the remaining hosts for this loop 13040 1726882407.88747: getting the next task for host managed_node1 13040 1726882407.88756: done getting next task for host managed_node1 13040 1726882407.88760: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13040 1726882407.88763: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.88781: getting variables 13040 1726882407.88784: in VariableManager get_vars() 13040 1726882407.88830: Calling all_inventory to load vars for managed_node1 13040 1726882407.88832: Calling groups_inventory to load vars for managed_node1 13040 1726882407.88834: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.88842: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.88845: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.88847: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.88966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.89095: done with get_vars() 13040 1726882407.89103: done getting variables 13040 1726882407.89143: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:27 -0400 (0:00:00.064) 0:00:05.368 ****** 13040 1726882407.89170: entering _queue_task() for managed_node1/service 13040 1726882407.89355: worker is 1 (out of 1 available) 13040 1726882407.89370: exiting _queue_task() for managed_node1/service 13040 1726882407.89381: done queuing things up, now waiting for results queue to drain 13040 1726882407.89383: waiting for pending results... 
13040 1726882407.89546: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13040 1726882407.89628: in run() - task 0e448fcc-3ce9-b123-314b-000000000086 13040 1726882407.89638: variable 'ansible_search_path' from source: unknown 13040 1726882407.89641: variable 'ansible_search_path' from source: unknown 13040 1726882407.89672: calling self._execute() 13040 1726882407.89736: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.89740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.89748: variable 'omit' from source: magic vars 13040 1726882407.90042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.92288: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.92358: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.92403: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.92442: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.92475: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.92555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.92591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882407.92621: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.92675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.92697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.92829: variable 'ansible_distribution' from source: facts 13040 1726882407.92840: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.92863: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.92868: when evaluation is False, skipping this task 13040 1726882407.92871: _execute() done 13040 1726882407.92873: dumping result to json 13040 1726882407.92875: done dumping result, returning 13040 1726882407.92883: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000086] 13040 1726882407.92892: sending task result for task 0e448fcc-3ce9-b123-314b-000000000086 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882407.93040: no more pending results, returning what we have 13040 1726882407.93044: results queue empty 13040 1726882407.93045: checking for any_errors_fatal 13040 1726882407.93053: done checking for any_errors_fatal 13040 1726882407.93054: checking for max_fail_percentage 13040 1726882407.93056: done checking for 
max_fail_percentage 13040 1726882407.93056: checking to see if all hosts have failed and the running result is not ok 13040 1726882407.93057: done checking to see if all hosts have failed 13040 1726882407.93058: getting the remaining hosts for this loop 13040 1726882407.93059: done getting the remaining hosts for this loop 13040 1726882407.93065: getting the next task for host managed_node1 13040 1726882407.93071: done getting next task for host managed_node1 13040 1726882407.93076: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13040 1726882407.93078: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.93095: getting variables 13040 1726882407.93096: in VariableManager get_vars() 13040 1726882407.93147: Calling all_inventory to load vars for managed_node1 13040 1726882407.93149: Calling groups_inventory to load vars for managed_node1 13040 1726882407.93154: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.93166: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.93168: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.93171: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.93329: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000086 13040 1726882407.93333: WORKER PROCESS EXITING 13040 1726882407.93421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.93645: done with get_vars() 13040 1726882407.93658: done getting variables 13040 1726882407.93725: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:27 -0400 (0:00:00.045) 0:00:05.414 ****** 13040 1726882407.93758: entering _queue_task() for managed_node1/service 13040 1726882407.94057: worker is 1 (out of 1 available) 13040 1726882407.94070: exiting _queue_task() for managed_node1/service 13040 1726882407.94084: done queuing things up, now waiting for results queue to drain 13040 1726882407.94085: waiting for pending results... 
13040 1726882407.94421: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13040 1726882407.94594: in run() - task 0e448fcc-3ce9-b123-314b-000000000087 13040 1726882407.94615: variable 'ansible_search_path' from source: unknown 13040 1726882407.94623: variable 'ansible_search_path' from source: unknown 13040 1726882407.94678: calling self._execute() 13040 1726882407.94795: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882407.94814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882407.94831: variable 'omit' from source: magic vars 13040 1726882407.95303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882407.97809: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882407.97910: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882407.97956: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882407.98003: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882407.98039: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882407.98132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882407.98172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882407.98212: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882407.98268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882407.98296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882407.98458: variable 'ansible_distribution' from source: facts 13040 1726882407.98472: variable 'ansible_distribution_major_version' from source: facts 13040 1726882407.98494: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882407.98504: when evaluation is False, skipping this task 13040 1726882407.98516: _execute() done 13040 1726882407.98525: dumping result to json 13040 1726882407.98532: done dumping result, returning 13040 1726882407.98544: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-b123-314b-000000000087] 13040 1726882407.98561: sending task result for task 0e448fcc-3ce9-b123-314b-000000000087 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13040 1726882407.98724: no more pending results, returning what we have 13040 1726882407.98728: results queue empty 13040 1726882407.98729: checking for any_errors_fatal 13040 1726882407.98737: done checking for any_errors_fatal 13040 1726882407.98738: checking for max_fail_percentage 13040 1726882407.98741: done checking for max_fail_percentage 13040 1726882407.98742: checking to see if all hosts have failed 
and the running result is not ok 13040 1726882407.98743: done checking to see if all hosts have failed 13040 1726882407.98744: getting the remaining hosts for this loop 13040 1726882407.98745: done getting the remaining hosts for this loop 13040 1726882407.98749: getting the next task for host managed_node1 13040 1726882407.98759: done getting next task for host managed_node1 13040 1726882407.98765: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13040 1726882407.98767: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882407.98788: getting variables 13040 1726882407.98790: in VariableManager get_vars() 13040 1726882407.98848: Calling all_inventory to load vars for managed_node1 13040 1726882407.98854: Calling groups_inventory to load vars for managed_node1 13040 1726882407.98857: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882407.98869: Calling all_plugins_play to load vars for managed_node1 13040 1726882407.98873: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882407.98876: Calling groups_plugins_play to load vars for managed_node1 13040 1726882407.99070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882407.99358: done with get_vars() 13040 1726882407.99373: done getting variables 13040 1726882407.99521: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000087 13040 1726882407.99525: WORKER PROCESS EXITING 13040 1726882407.99567: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:27 -0400 (0:00:00.058) 0:00:05.473 ****** 13040 1726882407.99605: entering _queue_task() for managed_node1/service 13040 1726882408.00081: worker is 1 (out of 1 available) 13040 1726882408.00092: exiting _queue_task() for managed_node1/service 13040 1726882408.00104: done queuing things up, now waiting for results queue to drain 13040 1726882408.00106: waiting for pending results... 
13040 1726882408.00416: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13040 1726882408.00562: in run() - task 0e448fcc-3ce9-b123-314b-000000000088 13040 1726882408.00584: variable 'ansible_search_path' from source: unknown 13040 1726882408.00592: variable 'ansible_search_path' from source: unknown 13040 1726882408.00639: calling self._execute() 13040 1726882408.00738: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.00750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.00773: variable 'omit' from source: magic vars 13040 1726882408.01245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.03961: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.04047: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.04095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.04142: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.04180: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.04273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.04308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.04346: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.04400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.04420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.04592: variable 'ansible_distribution' from source: facts 13040 1726882408.04604: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.04624: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.04631: when evaluation is False, skipping this task 13040 1726882408.04637: _execute() done 13040 1726882408.04651: dumping result to json 13040 1726882408.04662: done dumping result, returning 13040 1726882408.04677: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-b123-314b-000000000088] 13040 1726882408.04687: sending task result for task 0e448fcc-3ce9-b123-314b-000000000088 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.04846: no more pending results, returning what we have 13040 1726882408.04850: results queue empty 13040 1726882408.04854: checking for any_errors_fatal 13040 1726882408.04861: done checking for any_errors_fatal 13040 1726882408.04862: checking for max_fail_percentage 13040 1726882408.04866: done checking for max_fail_percentage 13040 
1726882408.04867: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.04868: done checking to see if all hosts have failed 13040 1726882408.04869: getting the remaining hosts for this loop 13040 1726882408.04870: done getting the remaining hosts for this loop 13040 1726882408.04874: getting the next task for host managed_node1 13040 1726882408.04881: done getting next task for host managed_node1 13040 1726882408.04885: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13040 1726882408.04888: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.04908: getting variables 13040 1726882408.04910: in VariableManager get_vars() 13040 1726882408.04968: Calling all_inventory to load vars for managed_node1 13040 1726882408.04972: Calling groups_inventory to load vars for managed_node1 13040 1726882408.04974: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.04985: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.04988: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.04991: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.05249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.05481: done with get_vars() 13040 1726882408.05492: done getting variables 13040 1726882408.05640: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000088 13040 1726882408.05643: WORKER PROCESS EXITING 13040 1726882408.05685: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:28 -0400 (0:00:00.061) 0:00:05.534 ****** 13040 1726882408.05834: entering _queue_task() for managed_node1/service 13040 1726882408.06193: worker is 1 (out of 1 available) 13040 1726882408.06208: exiting _queue_task() for managed_node1/service 13040 1726882408.06218: done queuing things up, now waiting for results queue to drain 13040 1726882408.06219: waiting for pending results... 
13040 1726882408.06509: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 13040 1726882408.06664: in run() - task 0e448fcc-3ce9-b123-314b-000000000089 13040 1726882408.06684: variable 'ansible_search_path' from source: unknown 13040 1726882408.06692: variable 'ansible_search_path' from source: unknown 13040 1726882408.06735: calling self._execute() 13040 1726882408.06835: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.06845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.06870: variable 'omit' from source: magic vars 13040 1726882408.07573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.10878: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.10967: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.11011: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.11050: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.11083: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.11167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.11201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.11225: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.11272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.11291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.11446: variable 'ansible_distribution' from source: facts 13040 1726882408.11450: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.11476: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.11480: when evaluation is False, skipping this task 13040 1726882408.11483: _execute() done 13040 1726882408.11485: dumping result to json 13040 1726882408.11488: done dumping result, returning 13040 1726882408.11497: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-b123-314b-000000000089] 13040 1726882408.11502: sending task result for task 0e448fcc-3ce9-b123-314b-000000000089 13040 1726882408.11611: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000089 13040 1726882408.11614: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13040 1726882408.11657: no more pending results, returning what we have 13040 1726882408.11661: results queue empty 13040 1726882408.11661: checking for any_errors_fatal 13040 1726882408.11672: done checking for any_errors_fatal 13040 1726882408.11672: checking for 
max_fail_percentage 13040 1726882408.11674: done checking for max_fail_percentage 13040 1726882408.11675: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.11676: done checking to see if all hosts have failed 13040 1726882408.11677: getting the remaining hosts for this loop 13040 1726882408.11678: done getting the remaining hosts for this loop 13040 1726882408.11681: getting the next task for host managed_node1 13040 1726882408.11687: done getting next task for host managed_node1 13040 1726882408.11691: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13040 1726882408.11693: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.11710: getting variables 13040 1726882408.11712: in VariableManager get_vars() 13040 1726882408.11766: Calling all_inventory to load vars for managed_node1 13040 1726882408.11769: Calling groups_inventory to load vars for managed_node1 13040 1726882408.11771: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.11780: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.11782: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.11785: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.12014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.12170: done with get_vars() 13040 1726882408.12184: done getting variables 13040 1726882408.12237: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:28 -0400 (0:00:00.065) 0:00:05.599 ****** 13040 1726882408.12270: entering _queue_task() for managed_node1/copy 13040 1726882408.12497: worker is 1 (out of 1 available) 13040 1726882408.12510: exiting _queue_task() for managed_node1/copy 13040 1726882408.12522: done queuing things up, now waiting for results queue to drain 13040 1726882408.12524: waiting for pending results... 
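Every skip in this stretch of the log traces back to the same guard: each task records `Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False`, so on this host none of the role's initscripts-era tasks run. A hedged sketch of what such a guarded task might look like follows — the task name, module, and `when` expression are taken verbatim from the log, but the actual body of `roles/network/tasks/main.yml` is not shown here, so the module parameters are placeholders:

```yaml
# Illustrative sketch only; the real task body is not in this log.
# Name, module ('copy'), and when-expression come from the log above.
- name: Ensure initscripts network file dependency is present
  copy:
    # src/dest are not recoverable from the log and are omitted here
    src: ...
    dest: ...
  when: >-
    ansible_distribution in ['CentOS','RedHat'] and
    ansible_distribution_major_version | int < 9
```

Because the evaluated conditional is False, the task executor skips before the module ever runs, which is why the result JSON shows only `"skip_reason": "Conditional result was False"` and `"changed": false`.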
13040 1726882408.12760: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13040 1726882408.12903: in run() - task 0e448fcc-3ce9-b123-314b-00000000008a 13040 1726882408.12928: variable 'ansible_search_path' from source: unknown 13040 1726882408.12936: variable 'ansible_search_path' from source: unknown 13040 1726882408.12980: calling self._execute() 13040 1726882408.13082: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.13092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.13105: variable 'omit' from source: magic vars 13040 1726882408.13537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.16565: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.16614: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.16642: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.16672: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.16692: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.16751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.16776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.16793: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.16821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.16832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.16932: variable 'ansible_distribution' from source: facts 13040 1726882408.16937: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.16953: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.16959: when evaluation is False, skipping this task 13040 1726882408.16961: _execute() done 13040 1726882408.16965: dumping result to json 13040 1726882408.16969: done dumping result, returning 13040 1726882408.16977: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-b123-314b-00000000008a] 13040 1726882408.16982: sending task result for task 0e448fcc-3ce9-b123-314b-00000000008a 13040 1726882408.17074: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000008a 13040 1726882408.17077: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.17118: no more pending results, returning what we have 13040 1726882408.17121: results queue empty 13040 1726882408.17122: checking for any_errors_fatal 13040 1726882408.17127: 
done checking for any_errors_fatal 13040 1726882408.17128: checking for max_fail_percentage 13040 1726882408.17130: done checking for max_fail_percentage 13040 1726882408.17131: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.17132: done checking to see if all hosts have failed 13040 1726882408.17132: getting the remaining hosts for this loop 13040 1726882408.17134: done getting the remaining hosts for this loop 13040 1726882408.17137: getting the next task for host managed_node1 13040 1726882408.17143: done getting next task for host managed_node1 13040 1726882408.17147: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13040 1726882408.17149: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.17172: getting variables 13040 1726882408.17174: in VariableManager get_vars() 13040 1726882408.17225: Calling all_inventory to load vars for managed_node1 13040 1726882408.17228: Calling groups_inventory to load vars for managed_node1 13040 1726882408.17230: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.17239: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.17241: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.17244: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.17412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.17535: done with get_vars() 13040 1726882408.17543: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:28 -0400 (0:00:00.053) 0:00:05.653 ****** 13040 1726882408.17603: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13040 1726882408.17839: worker is 1 (out of 1 available) 13040 1726882408.17855: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13040 1726882408.17870: done queuing things up, now waiting for results queue to drain 13040 1726882408.17871: waiting for pending results... 
13040 1726882408.19045: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13040 1726882408.19098: in run() - task 0e448fcc-3ce9-b123-314b-00000000008b 13040 1726882408.19129: variable 'ansible_search_path' from source: unknown 13040 1726882408.19132: variable 'ansible_search_path' from source: unknown 13040 1726882408.19154: calling self._execute() 13040 1726882408.19229: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.19232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.19242: variable 'omit' from source: magic vars 13040 1726882408.19661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.21516: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.21573: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.21600: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.21624: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.21647: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.21705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.21725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.21749: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.21775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.21786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.21879: variable 'ansible_distribution' from source: facts 13040 1726882408.21884: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.21898: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.21901: when evaluation is False, skipping this task 13040 1726882408.21903: _execute() done 13040 1726882408.21905: dumping result to json 13040 1726882408.21909: done dumping result, returning 13040 1726882408.21917: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-b123-314b-00000000008b] 13040 1726882408.21921: sending task result for task 0e448fcc-3ce9-b123-314b-00000000008b 13040 1726882408.22015: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000008b 13040 1726882408.22018: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.22069: no more pending results, returning what we have 13040 1726882408.22073: results queue empty 13040 1726882408.22074: checking for any_errors_fatal 13040 1726882408.22080: done 
checking for any_errors_fatal 13040 1726882408.22081: checking for max_fail_percentage 13040 1726882408.22082: done checking for max_fail_percentage 13040 1726882408.22083: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.22084: done checking to see if all hosts have failed 13040 1726882408.22084: getting the remaining hosts for this loop 13040 1726882408.22086: done getting the remaining hosts for this loop 13040 1726882408.22089: getting the next task for host managed_node1 13040 1726882408.22095: done getting next task for host managed_node1 13040 1726882408.22099: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13040 1726882408.22102: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.22119: getting variables 13040 1726882408.22120: in VariableManager get_vars() 13040 1726882408.22170: Calling all_inventory to load vars for managed_node1 13040 1726882408.22173: Calling groups_inventory to load vars for managed_node1 13040 1726882408.22175: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.22183: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.22185: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.22187: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.22305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.22433: done with get_vars() 13040 1726882408.22441: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:28 -0400 (0:00:00.048) 0:00:05.702 ****** 13040 1726882408.22501: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13040 1726882408.22790: worker is 1 (out of 1 available) 13040 1726882408.22802: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13040 1726882408.22814: done queuing things up, now waiting for results queue to drain 13040 1726882408.22815: waiting for pending results... 
13040 1726882408.23098: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 13040 1726882408.23224: in run() - task 0e448fcc-3ce9-b123-314b-00000000008c 13040 1726882408.23242: variable 'ansible_search_path' from source: unknown 13040 1726882408.23249: variable 'ansible_search_path' from source: unknown 13040 1726882408.23299: calling self._execute() 13040 1726882408.23394: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.23405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.23420: variable 'omit' from source: magic vars 13040 1726882408.23859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.25468: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.25512: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.25539: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.25567: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.25586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.25641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.25663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.25682: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.25707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.25721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.25810: variable 'ansible_distribution' from source: facts 13040 1726882408.25814: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.25831: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.25834: when evaluation is False, skipping this task 13040 1726882408.25836: _execute() done 13040 1726882408.25839: dumping result to json 13040 1726882408.25841: done dumping result, returning 13040 1726882408.25847: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-b123-314b-00000000008c] 13040 1726882408.25855: sending task result for task 0e448fcc-3ce9-b123-314b-00000000008c 13040 1726882408.25939: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000008c 13040 1726882408.25941: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.26009: no more pending results, returning what we have 13040 1726882408.26013: results queue empty 13040 1726882408.26014: checking for any_errors_fatal 13040 1726882408.26021: done checking for 
any_errors_fatal 13040 1726882408.26022: checking for max_fail_percentage 13040 1726882408.26023: done checking for max_fail_percentage 13040 1726882408.26024: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.26025: done checking to see if all hosts have failed 13040 1726882408.26025: getting the remaining hosts for this loop 13040 1726882408.26026: done getting the remaining hosts for this loop 13040 1726882408.26030: getting the next task for host managed_node1 13040 1726882408.26035: done getting next task for host managed_node1 13040 1726882408.26040: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13040 1726882408.26042: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.26063: getting variables 13040 1726882408.26065: in VariableManager get_vars() 13040 1726882408.26111: Calling all_inventory to load vars for managed_node1 13040 1726882408.26114: Calling groups_inventory to load vars for managed_node1 13040 1726882408.26116: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.26124: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.26126: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.26128: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.26280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.26402: done with get_vars() 13040 1726882408.26409: done getting variables 13040 1726882408.26450: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:28 -0400 (0:00:00.039) 0:00:05.741 ****** 13040 1726882408.26476: entering _queue_task() for managed_node1/debug 13040 1726882408.26666: worker is 1 (out of 1 available) 13040 1726882408.26680: exiting _queue_task() for managed_node1/debug 13040 1726882408.26691: done queuing things up, now waiting for results queue to drain 13040 1726882408.26693: waiting for pending results... 
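One difference between the skip results above: the "Enable network service" task reported `"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"`, while the later tasks show their `false_condition` in the clear. That censoring happens whenever `no_log` is set on a task, even for a skip. A hedged sketch (only the task name, module, and the presence of `no_log` are attested by the log; everything else is illustrative):

```yaml
# Illustrative sketch; parameters are not recoverable from this log.
# The log confirms the 'service' action module and that no_log was set.
- name: Enable network service
  service:
    # service name/state omitted; not shown in the log
    name: ...
    enabled: true
  no_log: true
  when: >-
    ansible_distribution in ['CentOS','RedHat'] and
    ansible_distribution_major_version | int < 9
```

With `no_log: true`, Ansible suppresses the result payload in all output, including verbose (-vvvv) logs like this one, replacing it with the "censored" placeholder seen above.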
13040 1726882408.26856: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13040 1726882408.26941: in run() - task 0e448fcc-3ce9-b123-314b-00000000008d 13040 1726882408.26955: variable 'ansible_search_path' from source: unknown 13040 1726882408.26958: variable 'ansible_search_path' from source: unknown 13040 1726882408.26989: calling self._execute() 13040 1726882408.27043: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.27048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.27057: variable 'omit' from source: magic vars 13040 1726882408.27351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.28907: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.28960: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.28989: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.29014: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.29033: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.29094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.29114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.29131: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.29161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.29172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.29265: variable 'ansible_distribution' from source: facts 13040 1726882408.29269: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.29286: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.29289: when evaluation is False, skipping this task 13040 1726882408.29291: _execute() done 13040 1726882408.29293: dumping result to json 13040 1726882408.29295: done dumping result, returning 13040 1726882408.29304: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-b123-314b-00000000008d] 13040 1726882408.29309: sending task result for task 0e448fcc-3ce9-b123-314b-00000000008d 13040 1726882408.29394: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000008d 13040 1726882408.29397: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882408.29448: no more pending results, returning what we have 13040 1726882408.29454: results queue empty 13040 1726882408.29455: checking for any_errors_fatal 13040 1726882408.29463: done checking for any_errors_fatal 13040 1726882408.29465: checking 
for max_fail_percentage 13040 1726882408.29467: done checking for max_fail_percentage 13040 1726882408.29468: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.29468: done checking to see if all hosts have failed 13040 1726882408.29469: getting the remaining hosts for this loop 13040 1726882408.29471: done getting the remaining hosts for this loop 13040 1726882408.29474: getting the next task for host managed_node1 13040 1726882408.29479: done getting next task for host managed_node1 13040 1726882408.29484: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13040 1726882408.29486: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.29502: getting variables 13040 1726882408.29504: in VariableManager get_vars() 13040 1726882408.29557: Calling all_inventory to load vars for managed_node1 13040 1726882408.29560: Calling groups_inventory to load vars for managed_node1 13040 1726882408.29562: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.29572: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.29574: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.29576: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.29688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.29814: done with get_vars() 13040 1726882408.29822: done getting variables 13040 1726882408.29868: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:28 -0400 (0:00:00.034) 0:00:05.776 ****** 13040 1726882408.29890: entering _queue_task() for managed_node1/debug 13040 1726882408.30077: worker is 1 (out of 1 available) 13040 1726882408.30090: exiting _queue_task() for managed_node1/debug 13040 1726882408.30102: done queuing things up, now waiting for results queue to drain 13040 1726882408.30103: waiting for pending results... 
13040 1726882408.30275: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13040 1726882408.30353: in run() - task 0e448fcc-3ce9-b123-314b-00000000008e 13040 1726882408.30362: variable 'ansible_search_path' from source: unknown 13040 1726882408.30370: variable 'ansible_search_path' from source: unknown 13040 1726882408.30399: calling self._execute() 13040 1726882408.30458: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.30464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.30472: variable 'omit' from source: magic vars 13040 1726882408.30762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.32329: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.32376: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.32402: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.32426: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.32446: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.32503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.32523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.32540: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.32573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.32583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.32675: variable 'ansible_distribution' from source: facts 13040 1726882408.32679: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.32693: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.32696: when evaluation is False, skipping this task 13040 1726882408.32698: _execute() done 13040 1726882408.32700: dumping result to json 13040 1726882408.32703: done dumping result, returning 13040 1726882408.32709: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-b123-314b-00000000008e] 13040 1726882408.32714: sending task result for task 0e448fcc-3ce9-b123-314b-00000000008e 13040 1726882408.32800: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000008e 13040 1726882408.32803: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882408.32874: no more pending results, returning what we have 13040 1726882408.32881: results queue empty 13040 1726882408.32882: checking for any_errors_fatal 13040 1726882408.32888: done checking for any_errors_fatal 13040 1726882408.32888: checking 
for max_fail_percentage 13040 1726882408.32890: done checking for max_fail_percentage 13040 1726882408.32891: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.32891: done checking to see if all hosts have failed 13040 1726882408.32892: getting the remaining hosts for this loop 13040 1726882408.32893: done getting the remaining hosts for this loop 13040 1726882408.32896: getting the next task for host managed_node1 13040 1726882408.32902: done getting next task for host managed_node1 13040 1726882408.32906: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13040 1726882408.32908: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.32927: getting variables 13040 1726882408.32929: in VariableManager get_vars() 13040 1726882408.32971: Calling all_inventory to load vars for managed_node1 13040 1726882408.32973: Calling groups_inventory to load vars for managed_node1 13040 1726882408.32975: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.32981: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.32985: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.32987: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.33132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.33255: done with get_vars() 13040 1726882408.33262: done getting variables 13040 1726882408.33301: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:28 -0400 (0:00:00.034) 0:00:05.810 ****** 13040 1726882408.33323: entering _queue_task() for managed_node1/debug 13040 1726882408.33508: worker is 1 (out of 1 available) 13040 1726882408.33520: exiting _queue_task() for managed_node1/debug 13040 1726882408.33532: done queuing things up, now waiting for results queue to drain 13040 1726882408.33533: waiting for pending results... 
13040 1726882408.33703: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13040 1726882408.33790: in run() - task 0e448fcc-3ce9-b123-314b-00000000008f 13040 1726882408.33799: variable 'ansible_search_path' from source: unknown 13040 1726882408.33802: variable 'ansible_search_path' from source: unknown 13040 1726882408.33828: calling self._execute() 13040 1726882408.33893: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.33897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.33906: variable 'omit' from source: magic vars 13040 1726882408.34198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.35746: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.35800: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.35827: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.35855: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.35876: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.35932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.35955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.35980: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.36006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.36016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.36118: variable 'ansible_distribution' from source: facts 13040 1726882408.36123: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.36137: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.36142: when evaluation is False, skipping this task 13040 1726882408.36145: _execute() done 13040 1726882408.36148: dumping result to json 13040 1726882408.36156: done dumping result, returning 13040 1726882408.36160: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-b123-314b-00000000008f] 13040 1726882408.36167: sending task result for task 0e448fcc-3ce9-b123-314b-00000000008f 13040 1726882408.36249: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000008f 13040 1726882408.36252: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882408.36315: no more pending results, returning what we have 13040 1726882408.36319: results queue empty 13040 1726882408.36320: checking for any_errors_fatal 13040 1726882408.36326: done checking for any_errors_fatal 13040 1726882408.36327: checking for 
max_fail_percentage 13040 1726882408.36329: done checking for max_fail_percentage 13040 1726882408.36330: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.36331: done checking to see if all hosts have failed 13040 1726882408.36331: getting the remaining hosts for this loop 13040 1726882408.36332: done getting the remaining hosts for this loop 13040 1726882408.36336: getting the next task for host managed_node1 13040 1726882408.36342: done getting next task for host managed_node1 13040 1726882408.36346: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13040 1726882408.36348: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.36367: getting variables 13040 1726882408.36370: in VariableManager get_vars() 13040 1726882408.36420: Calling all_inventory to load vars for managed_node1 13040 1726882408.36423: Calling groups_inventory to load vars for managed_node1 13040 1726882408.36425: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.36433: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.36435: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.36437: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.36547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.36675: done with get_vars() 13040 1726882408.36683: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:28 -0400 (0:00:00.034) 0:00:05.844 ****** 13040 1726882408.36748: entering _queue_task() for managed_node1/ping 13040 1726882408.36932: worker is 1 (out of 1 available) 13040 1726882408.36946: exiting _queue_task() for managed_node1/ping 13040 1726882408.36958: done queuing things up, now waiting for results queue to drain 13040 1726882408.36959: waiting for pending results... 
13040 1726882408.37136: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 13040 1726882408.37213: in run() - task 0e448fcc-3ce9-b123-314b-000000000090 13040 1726882408.37230: variable 'ansible_search_path' from source: unknown 13040 1726882408.37233: variable 'ansible_search_path' from source: unknown 13040 1726882408.37286: calling self._execute() 13040 1726882408.37348: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.37354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.37366: variable 'omit' from source: magic vars 13040 1726882408.37668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.39890: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.39939: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.39969: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.39994: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.40013: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.40075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.40095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.40112: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.40138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.40155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.40258: variable 'ansible_distribution' from source: facts 13040 1726882408.40270: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.40283: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.40286: when evaluation is False, skipping this task 13040 1726882408.40288: _execute() done 13040 1726882408.40291: dumping result to json 13040 1726882408.40293: done dumping result, returning 13040 1726882408.40301: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-b123-314b-000000000090] 13040 1726882408.40306: sending task result for task 0e448fcc-3ce9-b123-314b-000000000090 13040 1726882408.40391: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000090 13040 1726882408.40394: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.40436: no more pending results, returning what we have 13040 1726882408.40439: results queue empty 13040 1726882408.40440: checking for any_errors_fatal 13040 1726882408.40446: done checking for 
any_errors_fatal 13040 1726882408.40447: checking for max_fail_percentage 13040 1726882408.40449: done checking for max_fail_percentage 13040 1726882408.40449: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.40450: done checking to see if all hosts have failed 13040 1726882408.40451: getting the remaining hosts for this loop 13040 1726882408.40454: done getting the remaining hosts for this loop 13040 1726882408.40458: getting the next task for host managed_node1 13040 1726882408.40468: done getting next task for host managed_node1 13040 1726882408.40471: ^ task is: TASK: meta (role_complete) 13040 1726882408.40473: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.40490: getting variables 13040 1726882408.40493: in VariableManager get_vars() 13040 1726882408.40541: Calling all_inventory to load vars for managed_node1 13040 1726882408.40544: Calling groups_inventory to load vars for managed_node1 13040 1726882408.40547: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.40557: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.40560: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.40563: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.40691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.40855: done with get_vars() 13040 1726882408.40862: done getting variables 13040 1726882408.40919: done queuing things up, now waiting for results queue to drain 13040 1726882408.40921: results queue empty 13040 1726882408.40921: checking for any_errors_fatal 13040 1726882408.40923: done checking for any_errors_fatal 13040 1726882408.40923: checking for max_fail_percentage 13040 1726882408.40924: done checking for max_fail_percentage 13040 1726882408.40924: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.40925: done checking to see if all hosts have failed 13040 1726882408.40925: getting the remaining hosts for this loop 13040 1726882408.40926: done getting the remaining hosts for this loop 13040 1726882408.40927: getting the next task for host managed_node1 13040 1726882408.40930: done getting next task for host managed_node1 13040 1726882408.40932: ^ task is: TASK: From the active connection, get the port1 profile "{{ port1_profile }}" 13040 1726882408.40933: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13040 1726882408.40935: getting variables 13040 1726882408.40936: in VariableManager get_vars() 13040 1726882408.40949: Calling all_inventory to load vars for managed_node1 13040 1726882408.40950: Calling groups_inventory to load vars for managed_node1 13040 1726882408.40954: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.40957: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.40958: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.40960: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.41039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.41162: done with get_vars() 13040 1726882408.41169: done getting variables 13040 1726882408.41194: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13040 1726882408.41286: variable 'port1_profile' from source: play vars TASK [From the active connection, get the port1 profile "bond0.0"] ************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:104 Friday 20 September 2024 21:33:28 -0400 (0:00:00.045) 0:00:05.890 ****** 13040 1726882408.41305: entering _queue_task() for managed_node1/command 13040 1726882408.41621: worker is 1 (out of 1 available) 13040 1726882408.41634: exiting _queue_task() for managed_node1/command 13040 1726882408.41644: done queuing things up, now waiting for results queue to drain 13040 1726882408.41646: waiting for pending results... 
13040 1726882408.42077: running TaskExecutor() for managed_node1/TASK: From the active connection, get the port1 profile "bond0.0" 13040 1726882408.42199: in run() - task 0e448fcc-3ce9-b123-314b-0000000000c0 13040 1726882408.42217: variable 'ansible_search_path' from source: unknown 13040 1726882408.42260: calling self._execute() 13040 1726882408.42398: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.42412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.42426: variable 'omit' from source: magic vars 13040 1726882408.42971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.45118: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.45163: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.45191: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.45215: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.45235: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.45293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.45313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.45330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.45368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.45389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.45535: variable 'ansible_distribution' from source: facts 13040 1726882408.45559: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.45613: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.45621: when evaluation is False, skipping this task 13040 1726882408.45627: _execute() done 13040 1726882408.45633: dumping result to json 13040 1726882408.45640: done dumping result, returning 13040 1726882408.45655: done running TaskExecutor() for managed_node1/TASK: From the active connection, get the port1 profile "bond0.0" [0e448fcc-3ce9-b123-314b-0000000000c0] 13040 1726882408.45670: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c0 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.45832: no more pending results, returning what we have 13040 1726882408.45837: results queue empty 13040 1726882408.45838: checking for any_errors_fatal 13040 1726882408.45840: done checking for any_errors_fatal 13040 1726882408.45840: checking for max_fail_percentage 13040 1726882408.45842: done checking for max_fail_percentage 13040 1726882408.45843: checking to see if all hosts have failed and the running 
result is not ok 13040 1726882408.45844: done checking to see if all hosts have failed 13040 1726882408.45844: getting the remaining hosts for this loop 13040 1726882408.45846: done getting the remaining hosts for this loop 13040 1726882408.45849: getting the next task for host managed_node1 13040 1726882408.45858: done getting next task for host managed_node1 13040 1726882408.45862: ^ task is: TASK: From the active connection, get the port2 profile "{{ port2_profile }}" 13040 1726882408.45867: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882408.45871: getting variables 13040 1726882408.45872: in VariableManager get_vars() 13040 1726882408.45930: Calling all_inventory to load vars for managed_node1 13040 1726882408.45933: Calling groups_inventory to load vars for managed_node1 13040 1726882408.45935: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.45947: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.45950: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.45956: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.46133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.46523: done with get_vars() 13040 1726882408.46532: done getting variables 13040 1726882408.46567: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c0 13040 1726882408.46573: WORKER PROCESS EXITING 13040 1726882408.46627: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13040 1726882408.46748: variable 'port2_profile' from source: play vars TASK [From the active connection, get the port2 profile "bond0.1"] ************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:111 Friday 20 September 2024 21:33:28 -0400 (0:00:00.054) 0:00:05.944 ****** 13040 1726882408.46778: entering _queue_task() for managed_node1/command 13040 1726882408.46990: worker is 1 (out of 1 available) 13040 1726882408.47002: exiting _queue_task() for managed_node1/command 13040 1726882408.47013: done queuing things up, now waiting for results queue to drain 13040 1726882408.47014: waiting for pending results... 13040 1726882408.47180: running TaskExecutor() for managed_node1/TASK: From the active connection, get the port2 profile "bond0.1" 13040 1726882408.47238: in run() - task 0e448fcc-3ce9-b123-314b-0000000000c1 13040 1726882408.47248: variable 'ansible_search_path' from source: unknown 13040 1726882408.47280: calling self._execute() 13040 1726882408.47341: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.47344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.47356: variable 'omit' from source: magic vars 13040 1726882408.47651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.50169: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.50216: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.50251: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
13040 1726882408.50309: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.50329: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.50390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.50410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.50439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.50471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.50484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.50577: variable 'ansible_distribution' from source: facts 13040 1726882408.50583: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.50597: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.50599: when evaluation is False, skipping this task 13040 1726882408.50602: _execute() done 13040 1726882408.50604: dumping result to json 13040 1726882408.50606: done dumping result, returning 13040 1726882408.50613: done running TaskExecutor() for managed_node1/TASK: From the 
active connection, get the port2 profile "bond0.1" [0e448fcc-3ce9-b123-314b-0000000000c1] 13040 1726882408.50618: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c1 13040 1726882408.50707: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c1 13040 1726882408.50710: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.50757: no more pending results, returning what we have 13040 1726882408.50760: results queue empty 13040 1726882408.50761: checking for any_errors_fatal 13040 1726882408.50769: done checking for any_errors_fatal 13040 1726882408.50770: checking for max_fail_percentage 13040 1726882408.50772: done checking for max_fail_percentage 13040 1726882408.50773: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.50774: done checking to see if all hosts have failed 13040 1726882408.50774: getting the remaining hosts for this loop 13040 1726882408.50775: done getting the remaining hosts for this loop 13040 1726882408.50779: getting the next task for host managed_node1 13040 1726882408.50784: done getting next task for host managed_node1 13040 1726882408.50787: ^ task is: TASK: Assert that the port1 profile is not activated 13040 1726882408.50789: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.50792: getting variables 13040 1726882408.50793: in VariableManager get_vars() 13040 1726882408.50843: Calling all_inventory to load vars for managed_node1 13040 1726882408.50845: Calling groups_inventory to load vars for managed_node1 13040 1726882408.50848: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.50861: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.50865: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.50868: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.50992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.51112: done with get_vars() 13040 1726882408.51120: done getting variables 13040 1726882408.51166: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the port1 profile is not activated] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:118 Friday 20 September 2024 21:33:28 -0400 (0:00:00.044) 0:00:05.989 ****** 13040 1726882408.51186: entering _queue_task() for managed_node1/assert 13040 1726882408.51379: worker is 1 (out of 1 available) 13040 1726882408.51393: exiting _queue_task() for managed_node1/assert 13040 1726882408.51405: done queuing things up, now waiting for results queue to drain 13040 1726882408.51406: waiting for pending results... 
13040 1726882408.51569: running TaskExecutor() for managed_node1/TASK: Assert that the port1 profile is not activated 13040 1726882408.51631: in run() - task 0e448fcc-3ce9-b123-314b-0000000000c2 13040 1726882408.51642: variable 'ansible_search_path' from source: unknown 13040 1726882408.51675: calling self._execute() 13040 1726882408.51735: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.51739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.51748: variable 'omit' from source: magic vars 13040 1726882408.52070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.54522: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.54668: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.54735: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.54774: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.54793: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.54869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.54889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.54906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.54938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.54948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.55045: variable 'ansible_distribution' from source: facts 13040 1726882408.55050: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.55067: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.55070: when evaluation is False, skipping this task 13040 1726882408.55072: _execute() done 13040 1726882408.55075: dumping result to json 13040 1726882408.55077: done dumping result, returning 13040 1726882408.55084: done running TaskExecutor() for managed_node1/TASK: Assert that the port1 profile is not activated [0e448fcc-3ce9-b123-314b-0000000000c2] 13040 1726882408.55089: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c2 13040 1726882408.55178: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c2 13040 1726882408.55181: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.55221: no more pending results, returning what we have 13040 1726882408.55224: results queue empty 13040 1726882408.55225: checking for any_errors_fatal 13040 1726882408.55231: done checking for any_errors_fatal 13040 1726882408.55232: checking for max_fail_percentage 13040 
1726882408.55234: done checking for max_fail_percentage 13040 1726882408.55234: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.55235: done checking to see if all hosts have failed 13040 1726882408.55236: getting the remaining hosts for this loop 13040 1726882408.55237: done getting the remaining hosts for this loop 13040 1726882408.55240: getting the next task for host managed_node1 13040 1726882408.55245: done getting next task for host managed_node1 13040 1726882408.55247: ^ task is: TASK: Assert that the port2 profile is not activated 13040 1726882408.55249: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882408.55255: getting variables 13040 1726882408.55257: in VariableManager get_vars() 13040 1726882408.55309: Calling all_inventory to load vars for managed_node1 13040 1726882408.55312: Calling groups_inventory to load vars for managed_node1 13040 1726882408.55314: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.55323: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.55325: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.55328: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.55492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.55608: done with get_vars() 13040 1726882408.55615: done getting variables 13040 1726882408.55655: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the port2 profile is not activated] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:125 Friday 20 September 2024 21:33:28 -0400 (0:00:00.044) 0:00:06.033 ****** 13040 1726882408.55676: entering _queue_task() for managed_node1/assert 13040 1726882408.55852: worker is 1 (out of 1 available) 13040 1726882408.55867: exiting _queue_task() for managed_node1/assert 13040 1726882408.55879: done queuing things up, now waiting for results queue to drain 13040 1726882408.55881: waiting for pending results... 13040 1726882408.56052: running TaskExecutor() for managed_node1/TASK: Assert that the port2 profile is not activated 13040 1726882408.56115: in run() - task 0e448fcc-3ce9-b123-314b-0000000000c3 13040 1726882408.56127: variable 'ansible_search_path' from source: unknown 13040 1726882408.56160: calling self._execute() 13040 1726882408.56224: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.56227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.56237: variable 'omit' from source: magic vars 13040 1726882408.56543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.58710: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.58756: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.58793: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.58818: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.58838: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.58896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.58916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.58934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.58963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.58976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.59069: variable 'ansible_distribution' from source: facts 13040 1726882408.59074: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.59089: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.59092: when evaluation is False, skipping this task 13040 1726882408.59094: _execute() done 13040 1726882408.59096: dumping result to json 13040 1726882408.59100: done dumping result, returning 13040 1726882408.59107: done running TaskExecutor() for managed_node1/TASK: Assert that the port2 profile is not activated 
[0e448fcc-3ce9-b123-314b-0000000000c3] 13040 1726882408.59112: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c3 13040 1726882408.59201: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c3 13040 1726882408.59204: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.59267: no more pending results, returning what we have 13040 1726882408.59271: results queue empty 13040 1726882408.59272: checking for any_errors_fatal 13040 1726882408.59278: done checking for any_errors_fatal 13040 1726882408.59279: checking for max_fail_percentage 13040 1726882408.59281: done checking for max_fail_percentage 13040 1726882408.59282: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.59283: done checking to see if all hosts have failed 13040 1726882408.59283: getting the remaining hosts for this loop 13040 1726882408.59284: done getting the remaining hosts for this loop 13040 1726882408.59288: getting the next task for host managed_node1 13040 1726882408.59293: done getting next task for host managed_node1 13040 1726882408.59295: ^ task is: TASK: Get the port1 device state 13040 1726882408.59297: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.59300: getting variables 13040 1726882408.59302: in VariableManager get_vars() 13040 1726882408.59351: Calling all_inventory to load vars for managed_node1 13040 1726882408.59356: Calling groups_inventory to load vars for managed_node1 13040 1726882408.59358: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.59373: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.59376: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.59379: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.59495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.59615: done with get_vars() 13040 1726882408.59622: done getting variables 13040 1726882408.59666: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the port1 device state] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:132 Friday 20 September 2024 21:33:28 -0400 (0:00:00.040) 0:00:06.074 ****** 13040 1726882408.59686: entering _queue_task() for managed_node1/command 13040 1726882408.59878: worker is 1 (out of 1 available) 13040 1726882408.59892: exiting _queue_task() for managed_node1/command 13040 1726882408.59904: done queuing things up, now waiting for results queue to drain 13040 1726882408.59905: waiting for pending results... 
13040 1726882408.60072: running TaskExecutor() for managed_node1/TASK: Get the port1 device state 13040 1726882408.60128: in run() - task 0e448fcc-3ce9-b123-314b-0000000000c4 13040 1726882408.60140: variable 'ansible_search_path' from source: unknown 13040 1726882408.60172: calling self._execute() 13040 1726882408.60233: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.60237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.60244: variable 'omit' from source: magic vars 13040 1726882408.60557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.62342: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.62388: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.62416: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.62443: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.62467: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.62519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.62541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.62564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 13040 1726882408.62591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.62601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.62696: variable 'ansible_distribution' from source: facts 13040 1726882408.62700: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.62714: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.62717: when evaluation is False, skipping this task 13040 1726882408.62720: _execute() done 13040 1726882408.62722: dumping result to json 13040 1726882408.62724: done dumping result, returning 13040 1726882408.62730: done running TaskExecutor() for managed_node1/TASK: Get the port1 device state [0e448fcc-3ce9-b123-314b-0000000000c4] 13040 1726882408.62736: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c4 13040 1726882408.62822: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c4 13040 1726882408.62824: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.62877: no more pending results, returning what we have 13040 1726882408.62881: results queue empty 13040 1726882408.62882: checking for any_errors_fatal 13040 1726882408.62890: done checking for any_errors_fatal 13040 1726882408.62891: checking for max_fail_percentage 13040 1726882408.62892: done checking for max_fail_percentage 13040 1726882408.62893: checking to see if all hosts 
have failed and the running result is not ok 13040 1726882408.62894: done checking to see if all hosts have failed 13040 1726882408.62895: getting the remaining hosts for this loop 13040 1726882408.62896: done getting the remaining hosts for this loop 13040 1726882408.62899: getting the next task for host managed_node1 13040 1726882408.62905: done getting next task for host managed_node1 13040 1726882408.62908: ^ task is: TASK: Get the port2 device state 13040 1726882408.62909: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882408.62912: getting variables 13040 1726882408.62914: in VariableManager get_vars() 13040 1726882408.62969: Calling all_inventory to load vars for managed_node1 13040 1726882408.62973: Calling groups_inventory to load vars for managed_node1 13040 1726882408.62975: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.62985: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.62987: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.62989: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.63110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.63416: done with get_vars() 13040 1726882408.63423: done getting variables 13040 1726882408.63463: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the port2 device state] 
********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:139 Friday 20 September 2024 21:33:28 -0400 (0:00:00.037) 0:00:06.112 ****** 13040 1726882408.63482: entering _queue_task() for managed_node1/command 13040 1726882408.63671: worker is 1 (out of 1 available) 13040 1726882408.63683: exiting _queue_task() for managed_node1/command 13040 1726882408.63695: done queuing things up, now waiting for results queue to drain 13040 1726882408.63696: waiting for pending results... 13040 1726882408.63859: running TaskExecutor() for managed_node1/TASK: Get the port2 device state 13040 1726882408.63923: in run() - task 0e448fcc-3ce9-b123-314b-0000000000c5 13040 1726882408.63939: variable 'ansible_search_path' from source: unknown 13040 1726882408.63974: calling self._execute() 13040 1726882408.64037: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.64041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.64053: variable 'omit' from source: magic vars 13040 1726882408.64359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.65939: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.65994: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.66021: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.66046: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.66067: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.66124: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.66144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.66163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.66190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.66202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.66300: variable 'ansible_distribution' from source: facts 13040 1726882408.66303: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.66319: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.66324: when evaluation is False, skipping this task 13040 1726882408.66326: _execute() done 13040 1726882408.66328: dumping result to json 13040 1726882408.66330: done dumping result, returning 13040 1726882408.66336: done running TaskExecutor() for managed_node1/TASK: Get the port2 device state [0e448fcc-3ce9-b123-314b-0000000000c5] 13040 1726882408.66346: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c5 13040 1726882408.66429: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c5 13040 1726882408.66432: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.66480: no more pending results, returning what we have 13040 1726882408.66484: results queue empty 13040 1726882408.66485: checking for any_errors_fatal 13040 1726882408.66492: done checking for any_errors_fatal 13040 1726882408.66493: checking for max_fail_percentage 13040 1726882408.66495: done checking for max_fail_percentage 13040 1726882408.66495: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.66496: done checking to see if all hosts have failed 13040 1726882408.66497: getting the remaining hosts for this loop 13040 1726882408.66498: done getting the remaining hosts for this loop 13040 1726882408.66502: getting the next task for host managed_node1 13040 1726882408.66507: done getting next task for host managed_node1 13040 1726882408.66510: ^ task is: TASK: Assert that the port1 device is in DOWN state 13040 1726882408.66512: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.66514: getting variables 13040 1726882408.66516: in VariableManager get_vars() 13040 1726882408.66569: Calling all_inventory to load vars for managed_node1 13040 1726882408.66572: Calling groups_inventory to load vars for managed_node1 13040 1726882408.66574: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.66583: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.66586: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.66588: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.66716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.66835: done with get_vars() 13040 1726882408.66844: done getting variables 13040 1726882408.66887: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the port1 device is in DOWN state] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:146 Friday 20 September 2024 21:33:28 -0400 (0:00:00.034) 0:00:06.146 ****** 13040 1726882408.66906: entering _queue_task() for managed_node1/assert 13040 1726882408.67086: worker is 1 (out of 1 available) 13040 1726882408.67097: exiting _queue_task() for managed_node1/assert 13040 1726882408.67108: done queuing things up, now waiting for results queue to drain 13040 1726882408.67110: waiting for pending results... 
13040 1726882408.67287: running TaskExecutor() for managed_node1/TASK: Assert that the port1 device is in DOWN state 13040 1726882408.67348: in run() - task 0e448fcc-3ce9-b123-314b-0000000000c6 13040 1726882408.67362: variable 'ansible_search_path' from source: unknown 13040 1726882408.67390: calling self._execute() 13040 1726882408.67462: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.67467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.67477: variable 'omit' from source: magic vars 13040 1726882408.67778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.69424: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.69473: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.69500: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.69527: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.69547: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.69605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.69632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.69649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.69681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.69692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.69793: variable 'ansible_distribution' from source: facts 13040 1726882408.69796: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.69812: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.69815: when evaluation is False, skipping this task 13040 1726882408.69819: _execute() done 13040 1726882408.69822: dumping result to json 13040 1726882408.69824: done dumping result, returning 13040 1726882408.69828: done running TaskExecutor() for managed_node1/TASK: Assert that the port1 device is in DOWN state [0e448fcc-3ce9-b123-314b-0000000000c6] 13040 1726882408.69839: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c6 13040 1726882408.69925: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c6 13040 1726882408.69927: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.69995: no more pending results, returning what we have 13040 1726882408.69999: results queue empty 13040 1726882408.70000: checking for any_errors_fatal 13040 1726882408.70005: done checking for any_errors_fatal 13040 1726882408.70006: checking for max_fail_percentage 13040 
1726882408.70007: done checking for max_fail_percentage 13040 1726882408.70008: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.70009: done checking to see if all hosts have failed 13040 1726882408.70010: getting the remaining hosts for this loop 13040 1726882408.70011: done getting the remaining hosts for this loop 13040 1726882408.70014: getting the next task for host managed_node1 13040 1726882408.70020: done getting next task for host managed_node1 13040 1726882408.70023: ^ task is: TASK: Assert that the port2 device is in DOWN state 13040 1726882408.70025: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882408.70028: getting variables 13040 1726882408.70029: in VariableManager get_vars() 13040 1726882408.70090: Calling all_inventory to load vars for managed_node1 13040 1726882408.70093: Calling groups_inventory to load vars for managed_node1 13040 1726882408.70095: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.70104: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.70106: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.70109: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.70275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.70394: done with get_vars() 13040 1726882408.70403: done getting variables 13040 1726882408.70442: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the port2 device is in DOWN state] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:153 Friday 20 September 2024 21:33:28 -0400 (0:00:00.035) 0:00:06.181 ****** 13040 1726882408.70461: entering _queue_task() for managed_node1/assert 13040 1726882408.70647: worker is 1 (out of 1 available) 13040 1726882408.70659: exiting _queue_task() for managed_node1/assert 13040 1726882408.70674: done queuing things up, now waiting for results queue to drain 13040 1726882408.70675: waiting for pending results... 13040 1726882408.70847: running TaskExecutor() for managed_node1/TASK: Assert that the port2 device is in DOWN state 13040 1726882408.70912: in run() - task 0e448fcc-3ce9-b123-314b-0000000000c7 13040 1726882408.70923: variable 'ansible_search_path' from source: unknown 13040 1726882408.70957: calling self._execute() 13040 1726882408.71020: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.71024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.71037: variable 'omit' from source: magic vars 13040 1726882408.71368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.73496: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.73569: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.73597: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.73621: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.73641: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.73705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.73732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.73749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.73785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.73796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.73895: variable 'ansible_distribution' from source: facts 13040 1726882408.73899: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.73914: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.73917: when evaluation is False, skipping this task 13040 1726882408.73919: _execute() done 13040 1726882408.73922: dumping result to json 13040 1726882408.73924: done dumping result, returning 13040 1726882408.73930: done running TaskExecutor() for managed_node1/TASK: Assert that the port2 device is in DOWN state 
[0e448fcc-3ce9-b123-314b-0000000000c7] 13040 1726882408.73936: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c7 13040 1726882408.74021: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000c7 13040 1726882408.74024: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.74073: no more pending results, returning what we have 13040 1726882408.74076: results queue empty 13040 1726882408.74077: checking for any_errors_fatal 13040 1726882408.74084: done checking for any_errors_fatal 13040 1726882408.74084: checking for max_fail_percentage 13040 1726882408.74086: done checking for max_fail_percentage 13040 1726882408.74087: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.74088: done checking to see if all hosts have failed 13040 1726882408.74088: getting the remaining hosts for this loop 13040 1726882408.74090: done getting the remaining hosts for this loop 13040 1726882408.74094: getting the next task for host managed_node1 13040 1726882408.74102: done getting next task for host managed_node1 13040 1726882408.74108: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13040 1726882408.74110: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.74128: getting variables 13040 1726882408.74130: in VariableManager get_vars() 13040 1726882408.74186: Calling all_inventory to load vars for managed_node1 13040 1726882408.74189: Calling groups_inventory to load vars for managed_node1 13040 1726882408.74191: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.74199: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.74201: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.74204: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.74334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.74471: done with get_vars() 13040 1726882408.74480: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:28 -0400 (0:00:00.040) 0:00:06.222 ****** 13040 1726882408.74545: entering _queue_task() for managed_node1/include_tasks 13040 1726882408.74737: worker is 1 (out of 1 available) 13040 1726882408.74749: exiting _queue_task() for managed_node1/include_tasks 13040 1726882408.74765: done queuing things up, now waiting for results queue to drain 13040 1726882408.74766: waiting for pending results... 
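Every task in this stretch is skipped by the same `when:` clause, which the log reports as `Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False`. A minimal Python sketch of that boolean (an illustration only; Ansible actually evaluates the expression through Jinja2, and the sample facts below are hypothetical):

```python
def evaluate_when(facts):
    """Evaluate the `when:` expression seen in the log; the task runs only
    if this returns True. The log shows it evaluating to False on every
    task here, so each one is skipped."""
    return (facts.get("ansible_distribution") in ("CentOS", "RedHat")
            and int(facts.get("ansible_distribution_major_version", 0)) < 9)

# Hypothetical fact values for illustration; the real ones come from the
# managed host's gathered facts.
print(evaluate_when({"ansible_distribution": "CentOS",
                     "ansible_distribution_major_version": "9"}))   # False -> skipped
print(evaluate_when({"ansible_distribution": "RedHat",
                     "ansible_distribution_major_version": "8"}))   # True  -> would run
```

Since the host in this run is EL9 or later, the `< 9` comparison fails and the whole conjunction is False.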
13040 1726882408.74931: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13040 1726882408.75017: in run() - task 0e448fcc-3ce9-b123-314b-0000000000cf 13040 1726882408.75030: variable 'ansible_search_path' from source: unknown 13040 1726882408.75033: variable 'ansible_search_path' from source: unknown 13040 1726882408.75063: calling self._execute() 13040 1726882408.75127: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.75130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.75139: variable 'omit' from source: magic vars 13040 1726882408.75558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.78173: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.78219: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.78245: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.78273: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.78293: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.78350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.78373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.78390: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.78416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.78429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.78523: variable 'ansible_distribution' from source: facts 13040 1726882408.78528: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.78544: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.78548: when evaluation is False, skipping this task 13040 1726882408.78551: _execute() done 13040 1726882408.78556: dumping result to json 13040 1726882408.78558: done dumping result, returning 13040 1726882408.78561: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-b123-314b-0000000000cf] 13040 1726882408.78570: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000cf 13040 1726882408.78657: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000cf 13040 1726882408.78661: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.78717: no more pending results, returning what we have 13040 1726882408.78721: results queue empty 13040 1726882408.78721: checking for any_errors_fatal 13040 1726882408.78728: done checking for 
any_errors_fatal 13040 1726882408.78729: checking for max_fail_percentage 13040 1726882408.78730: done checking for max_fail_percentage 13040 1726882408.78731: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.78732: done checking to see if all hosts have failed 13040 1726882408.78732: getting the remaining hosts for this loop 13040 1726882408.78734: done getting the remaining hosts for this loop 13040 1726882408.78738: getting the next task for host managed_node1 13040 1726882408.78744: done getting next task for host managed_node1 13040 1726882408.78747: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13040 1726882408.78754: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.78773: getting variables 13040 1726882408.78775: in VariableManager get_vars() 13040 1726882408.78824: Calling all_inventory to load vars for managed_node1 13040 1726882408.78827: Calling groups_inventory to load vars for managed_node1 13040 1726882408.78829: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.78836: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.78838: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.78841: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.78967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.79150: done with get_vars() 13040 1726882408.79159: done getting variables 13040 1726882408.79227: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:28 -0400 (0:00:00.047) 0:00:06.269 ****** 13040 1726882408.79257: entering _queue_task() for managed_node1/debug 13040 1726882408.79508: worker is 1 (out of 1 available) 13040 1726882408.79525: exiting _queue_task() for managed_node1/debug 13040 1726882408.79536: done queuing things up, now waiting for results queue to drain 13040 1726882408.79538: waiting for pending results... 
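Each `TASK [...]` header in this log ends with two timings, e.g. `(0:00:00.047) 0:00:06.269`: the parenthesised value is the previous task's duration and the second is the cumulative elapsed time (note `0:00:06.181 + 0.040 ≈ 0:00:06.222` across consecutive headers above). A small parsing sketch; the helper name is ours, not an Ansible API:

```python
import re

def parse_timings(header):
    """Extract (per-task seconds, cumulative seconds) from a TASK header
    timestamp such as '... (0:00:00.047) 0:00:06.269 ******'."""
    m = re.search(r"\((\d+):(\d+):([\d.]+)\)\s+(\d+):(\d+):([\d.]+)", header)
    if not m:
        return None
    h1, m1, s1, h2, m2, s2 = m.groups()
    to_sec = lambda h, m, s: int(h) * 3600 + int(m) * 60 + float(s)
    return to_sec(h1, m1, s1), to_sec(h2, m2, s2)

# The header from this section of the log:
print(parse_timings("Friday 20 September 2024 21:33:28 -0400 (0:00:00.047) 0:00:06.269"))
```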
13040 1726882408.79816: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 13040 1726882408.79955: in run() - task 0e448fcc-3ce9-b123-314b-0000000000d0 13040 1726882408.79983: variable 'ansible_search_path' from source: unknown 13040 1726882408.79991: variable 'ansible_search_path' from source: unknown 13040 1726882408.80031: calling self._execute() 13040 1726882408.80124: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.80136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.80151: variable 'omit' from source: magic vars 13040 1726882408.80577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.82212: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.82271: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.82300: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.82325: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.82345: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.82413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.82448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882408.82470: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.82514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.82536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.82671: variable 'ansible_distribution' from source: facts 13040 1726882408.82684: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.82710: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.82718: when evaluation is False, skipping this task 13040 1726882408.82724: _execute() done 13040 1726882408.82729: dumping result to json 13040 1726882408.82735: done dumping result, returning 13040 1726882408.82747: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-b123-314b-0000000000d0] 13040 1726882408.82759: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d0 13040 1726882408.82868: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d0 13040 1726882408.82876: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882408.82925: no more pending results, returning what we have 13040 1726882408.82928: results queue empty 13040 1726882408.82929: checking for any_errors_fatal 13040 1726882408.82935: done checking for any_errors_fatal 13040 1726882408.82935: checking for max_fail_percentage 
13040 1726882408.82937: done checking for max_fail_percentage 13040 1726882408.82938: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.82939: done checking to see if all hosts have failed 13040 1726882408.82939: getting the remaining hosts for this loop 13040 1726882408.82940: done getting the remaining hosts for this loop 13040 1726882408.82944: getting the next task for host managed_node1 13040 1726882408.82949: done getting next task for host managed_node1 13040 1726882408.82955: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13040 1726882408.82958: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.82980: getting variables 13040 1726882408.82982: in VariableManager get_vars() 13040 1726882408.83032: Calling all_inventory to load vars for managed_node1 13040 1726882408.83034: Calling groups_inventory to load vars for managed_node1 13040 1726882408.83037: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.83046: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.83048: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.83051: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.83231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.83480: done with get_vars() 13040 1726882408.83491: done getting variables 13040 1726882408.83549: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:28 -0400 (0:00:00.043) 0:00:06.313 ****** 13040 1726882408.83588: entering _queue_task() for managed_node1/fail 13040 1726882408.83867: worker is 1 (out of 1 available) 13040 1726882408.83880: exiting _queue_task() for managed_node1/fail 13040 1726882408.83891: done queuing things up, now waiting for results queue to drain 13040 1726882408.83893: waiting for pending results... 
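The `^ state is: HOST STATE: block=2, task=24, ...` lines dump the linear strategy's per-host cursor: `block` and `task` index the host's position in the play, while the nested, parenthesised `tasks child state?` tracks progress inside an included block. A sketch of splitting the flat `key=value` pairs into a dict, dropping the nested parenthesised states (our helper for reading the log, not an Ansible API):

```python
import re

def parse_host_state(line):
    """Pull the top-level key=value pairs out of a HOST STATE dump,
    stripping anything inside parentheses (nested child states)."""
    flat = line
    while "(" in flat:  # peel nesting one level at a time
        flat = re.sub(r"\([^()]*\)", "", flat)
    return dict(re.findall(r"(\w+)=(\w+)", flat))

state = parse_host_state(
    "HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, "
    "run_state=1, fail_state=0, pre_flushing_run_state=1, "
    "update_handlers=True, pending_setup=False"
)
print(state["block"], state["task"])
```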
13040 1726882408.84188: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13040 1726882408.84434: in run() - task 0e448fcc-3ce9-b123-314b-0000000000d1 13040 1726882408.84460: variable 'ansible_search_path' from source: unknown 13040 1726882408.84471: variable 'ansible_search_path' from source: unknown 13040 1726882408.84533: calling self._execute() 13040 1726882408.84615: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.84618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.84633: variable 'omit' from source: magic vars 13040 1726882408.84988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.86735: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.86795: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.86835: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.86875: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.86904: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.86983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.87019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882408.87145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.87195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.87215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.87354: variable 'ansible_distribution' from source: facts 13040 1726882408.87369: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.87396: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.87403: when evaluation is False, skipping this task 13040 1726882408.87408: _execute() done 13040 1726882408.87414: dumping result to json 13040 1726882408.87420: done dumping result, returning 13040 1726882408.87431: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-b123-314b-0000000000d1] 13040 1726882408.87441: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d1 13040 1726882408.87550: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d1 13040 1726882408.87557: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.87781: no more pending results, returning what we 
have 13040 1726882408.87785: results queue empty 13040 1726882408.87785: checking for any_errors_fatal 13040 1726882408.87791: done checking for any_errors_fatal 13040 1726882408.87792: checking for max_fail_percentage 13040 1726882408.87794: done checking for max_fail_percentage 13040 1726882408.87795: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.87796: done checking to see if all hosts have failed 13040 1726882408.87796: getting the remaining hosts for this loop 13040 1726882408.87798: done getting the remaining hosts for this loop 13040 1726882408.87801: getting the next task for host managed_node1 13040 1726882408.87807: done getting next task for host managed_node1 13040 1726882408.87811: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13040 1726882408.87813: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.87833: getting variables 13040 1726882408.87840: in VariableManager get_vars() 13040 1726882408.87896: Calling all_inventory to load vars for managed_node1 13040 1726882408.87900: Calling groups_inventory to load vars for managed_node1 13040 1726882408.87902: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.87912: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.87915: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.87918: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.88166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.88497: done with get_vars() 13040 1726882408.88528: done getting variables 13040 1726882408.88591: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:28 -0400 (0:00:00.050) 0:00:06.363 ****** 13040 1726882408.88655: entering _queue_task() for managed_node1/fail 13040 1726882408.88987: worker is 1 (out of 1 available) 13040 1726882408.88998: exiting _queue_task() for managed_node1/fail 13040 1726882408.89012: done queuing things up, now waiting for results queue to drain 13040 1726882408.89013: waiting for pending results... 
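The repeating `entering _queue_task() ... worker is 1 (out of 1 available) ... waiting for pending results...` lines show the strategy handing each task to a worker and then draining a results queue. A schematic of that dispatch loop using threads and `queue.Queue` (Ansible actually uses forked `WorkerProcess`es running `TaskExecutor()`; this is only an analogy, and the skip result mirrors what every task in this section returned):

```python
import queue
import threading

def worker(task, results):
    # A real WorkerProcess runs TaskExecutor(); here we just report a skip,
    # matching the "Conditional result was False" outcome seen above.
    results.put({"task": task, "skipped": True,
                 "skip_reason": "Conditional result was False"})

results = queue.Queue()
tasks = ["Assert that the port2 device is in DOWN state",
         "fedora.linux_system_roles.network : Ensure ansible_facts used by role"]
for t in tasks:                       # _queue_task(): hand the task to a worker
    threading.Thread(target=worker, args=(t, results)).start()

drained = [results.get() for _ in tasks]   # "waiting for pending results..."
for r in drained:
    print(r["task"], "-> skipped")
```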
13040 1726882408.89332: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13040 1726882408.89488: in run() - task 0e448fcc-3ce9-b123-314b-0000000000d2 13040 1726882408.89524: variable 'ansible_search_path' from source: unknown 13040 1726882408.89533: variable 'ansible_search_path' from source: unknown 13040 1726882408.89583: calling self._execute() 13040 1726882408.89683: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.89694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.89716: variable 'omit' from source: magic vars 13040 1726882408.90223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.92601: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.92718: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.92767: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.92807: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.92840: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.92925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.92965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 
1726882408.92996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.93039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.93062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.93200: variable 'ansible_distribution' from source: facts 13040 1726882408.93211: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.93231: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.93239: when evaluation is False, skipping this task 13040 1726882408.93245: _execute() done 13040 1726882408.93250: dumping result to json 13040 1726882408.93260: done dumping result, returning 13040 1726882408.93275: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-b123-314b-0000000000d2] 13040 1726882408.93284: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d2 13040 1726882408.93414: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d2 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.93462: no more pending results, returning what we have 13040 1726882408.93469: results queue empty 13040 1726882408.93469: checking for 
any_errors_fatal 13040 1726882408.93475: done checking for any_errors_fatal 13040 1726882408.93475: checking for max_fail_percentage 13040 1726882408.93477: done checking for max_fail_percentage 13040 1726882408.93478: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.93479: done checking to see if all hosts have failed 13040 1726882408.93479: getting the remaining hosts for this loop 13040 1726882408.93480: done getting the remaining hosts for this loop 13040 1726882408.93484: getting the next task for host managed_node1 13040 1726882408.93490: done getting next task for host managed_node1 13040 1726882408.93494: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13040 1726882408.93496: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.93512: WORKER PROCESS EXITING 13040 1726882408.93524: getting variables 13040 1726882408.93526: in VariableManager get_vars() 13040 1726882408.93593: Calling all_inventory to load vars for managed_node1 13040 1726882408.93596: Calling groups_inventory to load vars for managed_node1 13040 1726882408.93599: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.93610: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.93612: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.93615: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.93792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.94018: done with get_vars() 13040 1726882408.94028: done getting variables 13040 1726882408.94078: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:28 -0400 (0:00:00.054) 0:00:06.418 ****** 13040 1726882408.94101: entering _queue_task() for managed_node1/fail 13040 1726882408.94297: worker is 1 (out of 1 available) 13040 1726882408.94308: exiting _queue_task() for managed_node1/fail 13040 1726882408.94320: done queuing things up, now waiting for results queue to drain 13040 1726882408.94322: waiting for pending results... 
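Each of the skipped tasks in this trace evaluates the same `when:` conditional against gathered facts and logs "Evaluated conditional (...): False". A minimal Python sketch of that boolean (the fact values passed in below are hypothetical examples, not taken from this run):

```python
def evaluate_conditional(facts):
    """Mirror of the logged conditional:
    ansible_distribution in ['CentOS','RedHat']
    and ansible_distribution_major_version | int < 9
    """
    return (
        facts["ansible_distribution"] in ["CentOS", "RedHat"]
        and int(facts["ansible_distribution_major_version"]) < 9
    )

# A RHEL/CentOS 9 host resolves to False, so the task is skipped with
# skip_reason "Conditional result was False", as in the entries above.
print(evaluate_conditional(
    {"ansible_distribution": "RedHat",
     "ansible_distribution_major_version": "9"}))
```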
13040 1726882408.94503: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13040 1726882408.94592: in run() - task 0e448fcc-3ce9-b123-314b-0000000000d3 13040 1726882408.94604: variable 'ansible_search_path' from source: unknown 13040 1726882408.94607: variable 'ansible_search_path' from source: unknown 13040 1726882408.94638: calling self._execute() 13040 1726882408.94704: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.94707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.94715: variable 'omit' from source: magic vars 13040 1726882408.95021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882408.96692: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882408.96738: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882408.96769: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882408.96796: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882408.96816: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882408.96875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882408.96901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 
1726882408.96915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882408.96943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882408.96953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882408.97056: variable 'ansible_distribution' from source: facts 13040 1726882408.97065: variable 'ansible_distribution_major_version' from source: facts 13040 1726882408.97082: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882408.97085: when evaluation is False, skipping this task 13040 1726882408.97087: _execute() done 13040 1726882408.97090: dumping result to json 13040 1726882408.97092: done dumping result, returning 13040 1726882408.97101: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-b123-314b-0000000000d3] 13040 1726882408.97106: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d3 13040 1726882408.97198: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d3 13040 1726882408.97200: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882408.97248: no more pending results, returning what we have 13040 1726882408.97252: results queue 
empty 13040 1726882408.97253: checking for any_errors_fatal 13040 1726882408.97258: done checking for any_errors_fatal 13040 1726882408.97259: checking for max_fail_percentage 13040 1726882408.97261: done checking for max_fail_percentage 13040 1726882408.97262: checking to see if all hosts have failed and the running result is not ok 13040 1726882408.97262: done checking to see if all hosts have failed 13040 1726882408.97264: getting the remaining hosts for this loop 13040 1726882408.97266: done getting the remaining hosts for this loop 13040 1726882408.97269: getting the next task for host managed_node1 13040 1726882408.97276: done getting next task for host managed_node1 13040 1726882408.97280: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13040 1726882408.97283: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882408.97300: getting variables 13040 1726882408.97302: in VariableManager get_vars() 13040 1726882408.97357: Calling all_inventory to load vars for managed_node1 13040 1726882408.97360: Calling groups_inventory to load vars for managed_node1 13040 1726882408.97362: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882408.97372: Calling all_plugins_play to load vars for managed_node1 13040 1726882408.97374: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882408.97377: Calling groups_plugins_play to load vars for managed_node1 13040 1726882408.97538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882408.97668: done with get_vars() 13040 1726882408.97676: done getting variables 13040 1726882408.97717: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:28 -0400 (0:00:00.036) 0:00:06.454 ****** 13040 1726882408.97738: entering _queue_task() for managed_node1/dnf 13040 1726882408.97939: worker is 1 (out of 1 available) 13040 1726882408.97952: exiting _queue_task() for managed_node1/dnf 13040 1726882408.97965: done queuing things up, now waiting for results queue to drain 13040 1726882408.97966: waiting for pending results... 
13040 1726882408.98145: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13040 1726882408.98236: in run() - task 0e448fcc-3ce9-b123-314b-0000000000d4 13040 1726882408.98249: variable 'ansible_search_path' from source: unknown 13040 1726882408.98255: variable 'ansible_search_path' from source: unknown 13040 1726882408.98287: calling self._execute() 13040 1726882408.98358: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882408.98362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882408.98371: variable 'omit' from source: magic vars 13040 1726882408.98690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.00325: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.00386: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.00414: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.00438: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.00461: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.00521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.00539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882409.00560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.00589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.00605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.00712: variable 'ansible_distribution' from source: facts 13040 1726882409.00717: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.00732: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.00735: when evaluation is False, skipping this task 13040 1726882409.00738: _execute() done 13040 1726882409.00740: dumping result to json 13040 1726882409.00742: done dumping result, returning 13040 1726882409.00750: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-0000000000d4] 13040 1726882409.00756: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d4 13040 1726882409.00846: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d4 13040 1726882409.00849: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.00902: no more pending results, returning what 
we have 13040 1726882409.00905: results queue empty 13040 1726882409.00906: checking for any_errors_fatal 13040 1726882409.00913: done checking for any_errors_fatal 13040 1726882409.00913: checking for max_fail_percentage 13040 1726882409.00915: done checking for max_fail_percentage 13040 1726882409.00916: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.00917: done checking to see if all hosts have failed 13040 1726882409.00917: getting the remaining hosts for this loop 13040 1726882409.00919: done getting the remaining hosts for this loop 13040 1726882409.00922: getting the next task for host managed_node1 13040 1726882409.00928: done getting next task for host managed_node1 13040 1726882409.00932: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13040 1726882409.00934: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.00954: getting variables 13040 1726882409.00956: in VariableManager get_vars() 13040 1726882409.01016: Calling all_inventory to load vars for managed_node1 13040 1726882409.01019: Calling groups_inventory to load vars for managed_node1 13040 1726882409.01021: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.01030: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.01032: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.01034: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.01168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.01354: done with get_vars() 13040 1726882409.01362: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13040 1726882409.01432: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:29 -0400 (0:00:00.037) 0:00:06.491 ****** 13040 1726882409.01458: entering _queue_task() for managed_node1/yum 13040 1726882409.01657: worker is 1 (out of 1 available) 13040 1726882409.01671: exiting _queue_task() for managed_node1/yum 13040 1726882409.01683: done queuing things up, now waiting for results queue to drain 13040 1726882409.01684: waiting for pending results... 
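The line "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" above shows the plugin loader following a routing entry before loading the action module. A toy sketch of that lookup — the redirect pair is taken from the log, but the table shape and function are illustrative assumptions, not ansible-core's actual routing data structure:

```python
# Hypothetical redirect table; the real routing lives in ansible-core's
# plugin loader and collection routing metadata (meta/runtime.yml).
ACTION_REDIRECTS = {
    "ansible.builtin.yum": "ansible.builtin.dnf",  # pair seen in the log
}

def resolve_action(fqcn):
    """Follow redirects until a terminal plugin name is reached."""
    seen = set()
    while fqcn in ACTION_REDIRECTS:
        if fqcn in seen:  # guard against a redirect cycle
            raise RuntimeError(f"redirect loop at {fqcn}")
        seen.add(fqcn)
        fqcn = ACTION_REDIRECTS[fqcn]
    return fqcn

print(resolve_action("ansible.builtin.yum"))  # resolves to the dnf action
```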
13040 1726882409.01850: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13040 1726882409.01942: in run() - task 0e448fcc-3ce9-b123-314b-0000000000d5 13040 1726882409.01955: variable 'ansible_search_path' from source: unknown 13040 1726882409.01959: variable 'ansible_search_path' from source: unknown 13040 1726882409.01989: calling self._execute() 13040 1726882409.02054: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.02058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.02072: variable 'omit' from source: magic vars 13040 1726882409.02372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.03995: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.04039: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.04070: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.04095: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.04115: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.04173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.04194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882409.04212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.04238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.04248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.04347: variable 'ansible_distribution' from source: facts 13040 1726882409.04354: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.04368: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.04371: when evaluation is False, skipping this task 13040 1726882409.04373: _execute() done 13040 1726882409.04377: dumping result to json 13040 1726882409.04380: done dumping result, returning 13040 1726882409.04391: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-0000000000d5] 13040 1726882409.04394: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d5 13040 1726882409.04484: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d5 13040 1726882409.04487: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.04548: no more pending results, returning what 
we have 13040 1726882409.04551: results queue empty 13040 1726882409.04554: checking for any_errors_fatal 13040 1726882409.04560: done checking for any_errors_fatal 13040 1726882409.04561: checking for max_fail_percentage 13040 1726882409.04563: done checking for max_fail_percentage 13040 1726882409.04565: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.04566: done checking to see if all hosts have failed 13040 1726882409.04567: getting the remaining hosts for this loop 13040 1726882409.04568: done getting the remaining hosts for this loop 13040 1726882409.04572: getting the next task for host managed_node1 13040 1726882409.04578: done getting next task for host managed_node1 13040 1726882409.04582: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13040 1726882409.04584: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.04607: getting variables 13040 1726882409.04609: in VariableManager get_vars() 13040 1726882409.04658: Calling all_inventory to load vars for managed_node1 13040 1726882409.04662: Calling groups_inventory to load vars for managed_node1 13040 1726882409.04666: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.04674: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.04676: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.04678: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.04833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.04959: done with get_vars() 13040 1726882409.04969: done getting variables 13040 1726882409.05008: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:29 -0400 (0:00:00.035) 0:00:06.527 ****** 13040 1726882409.05029: entering _queue_task() for managed_node1/fail 13040 1726882409.05222: worker is 1 (out of 1 available) 13040 1726882409.05234: exiting _queue_task() for managed_node1/fail 13040 1726882409.05245: done queuing things up, now waiting for results queue to drain 13040 1726882409.05246: waiting for pending results... 
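The repeated "entering _queue_task() ... done queuing things up, now waiting for results queue to drain" sequence is a queue-then-drain worker pattern: the strategy hands each task to a worker and blocks on a results queue. A self-contained stdlib toy of that shape (this is an illustrative sketch, not ansible-core's WorkerProcess implementation, which uses separate processes rather than a thread):

```python
import queue
import threading

def worker(tasks, results):
    """Drain the task queue, pushing one result per task."""
    while True:
        task = tasks.get()
        if task is None:                 # sentinel: no more work
            break
        results.put((task, "skipped"))   # toy outcome for each task

tasks, results = queue.Queue(), queue.Queue()
t = threading.Thread(target=worker, args=(tasks, results))
t.start()
for name in ["fail", "dnf", "yum"]:      # action names seen in the log
    tasks.put(name)
tasks.put(None)                          # signal the queue is drained
t.join()

collected = []
while not results.empty():               # "waiting for pending results..."
    collected.append(results.get())
print(collected)
```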
13040 1726882409.05417: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13040 1726882409.05504: in run() - task 0e448fcc-3ce9-b123-314b-0000000000d6 13040 1726882409.05514: variable 'ansible_search_path' from source: unknown 13040 1726882409.05517: variable 'ansible_search_path' from source: unknown 13040 1726882409.05545: calling self._execute() 13040 1726882409.05616: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.05619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.05628: variable 'omit' from source: magic vars 13040 1726882409.05937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.07566: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.07620: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.07647: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.07680: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.07700: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.07757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.07786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.07804: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882409.07829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882409.07840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882409.07938: variable 'ansible_distribution' from source: facts
13040 1726882409.07942: variable 'ansible_distribution_major_version' from source: facts
13040 1726882409.07957: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882409.07960: when evaluation is False, skipping this task
13040 1726882409.07963: _execute() done
13040 1726882409.07967: dumping result to json
13040 1726882409.07971: done dumping result, returning
13040 1726882409.07984: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-0000000000d6]
13040 1726882409.07989: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d6
13040 1726882409.08075: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d6
13040 1726882409.08078: WORKER PROCESS EXITING
skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" }
13040 1726882409.08131: no more pending results, returning what we have
13040 1726882409.08134: results queue empty
13040 1726882409.08135: checking for any_errors_fatal
13040 1726882409.08141: done checking for any_errors_fatal
13040 1726882409.08142: checking for max_fail_percentage
13040 1726882409.08143: done checking for max_fail_percentage
13040 1726882409.08144: checking to see if all hosts have failed and the running result is not ok
13040 1726882409.08145: done checking to see if all hosts have failed
13040 1726882409.08145: getting the remaining hosts for this loop
13040 1726882409.08147: done getting the remaining hosts for this loop
13040 1726882409.08150: getting the next task for host managed_node1
13040 1726882409.08286: done getting next task for host managed_node1
13040 1726882409.08291: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
13040 1726882409.08298: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882409.08315: getting variables
13040 1726882409.08317: in VariableManager get_vars()
13040 1726882409.08368: Calling all_inventory to load vars for managed_node1
13040 1726882409.08371: Calling groups_inventory to load vars for managed_node1
13040 1726882409.08373: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882409.08382: Calling all_plugins_play to load vars for managed_node1
13040 1726882409.08385: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882409.08388: Calling groups_plugins_play to load vars for managed_node1
13040 1726882409.08574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882409.08801: done with get_vars()
13040 1726882409.08812: done getting variables
13040 1726882409.08878: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:33:29 -0400 (0:00:00.038) 0:00:06.566 ******
13040 1726882409.08908: entering _queue_task() for managed_node1/package
13040 1726882409.09181: worker is 1 (out of 1 available)
13040 1726882409.09192: exiting _queue_task() for managed_node1/package
13040 1726882409.09204: done queuing things up, now waiting for results queue to drain
13040 1726882409.09206: waiting for pending results...
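Each skipped task above reports the same guard: `Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False`. As a minimal sketch of what that `when:` expression boils down to, the check can be reproduced in plain Python. Note the fact values below are hypothetical (this log excerpt never prints the managed node's actual distribution); only the conditional's shape is taken from the log.

```python
# Sketch of the role's skip conditional, as printed in the log:
#   (ansible_distribution in ['CentOS','RedHat']
#    and ansible_distribution_major_version | int < 9)
# Fact values here are assumptions for illustration, not from this log.
facts = {
    "ansible_distribution": "Fedora",            # hypothetical value
    "ansible_distribution_major_version": "40",  # hypothetical value
}

# The `| int` Jinja2 filter corresponds to int() coercion here.
conditional = (
    facts["ansible_distribution"] in ["CentOS", "RedHat"]
    and int(facts["ansible_distribution_major_version"]) < 9
)

print(conditional)  # False -> the task is skipped, matching the log
```

When the expression is False, Ansible skips the task and the callback prints the `skipping: [...]` result with `"skip_reason": "Conditional result was False"`, exactly as seen for each task in this run.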
13040 1726882409.09489: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
13040 1726882409.09597: in run() - task 0e448fcc-3ce9-b123-314b-0000000000d7
13040 1726882409.09610: variable 'ansible_search_path' from source: unknown
13040 1726882409.09614: variable 'ansible_search_path' from source: unknown
13040 1726882409.09642: calling self._execute()
13040 1726882409.09721: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882409.09725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882409.09734: variable 'omit' from source: magic vars
13040 1726882409.10038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882409.11770: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882409.11837: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882409.11879: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882409.11918: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882409.11947: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882409.12021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882409.12053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882409.12086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882409.12130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882409.12149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882409.12280: variable 'ansible_distribution' from source: facts
13040 1726882409.12291: variable 'ansible_distribution_major_version' from source: facts
13040 1726882409.12313: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882409.12320: when evaluation is False, skipping this task
13040 1726882409.12328: _execute() done
13040 1726882409.12334: dumping result to json
13040 1726882409.12341: done dumping result, returning
13040 1726882409.12351: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-b123-314b-0000000000d7]
13040 1726882409.12361: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d7
skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" }
13040 1726882409.12514: no more pending results, returning what we have
13040 1726882409.12518: results queue empty
13040 1726882409.12519: checking for any_errors_fatal
13040 1726882409.12524: done checking for any_errors_fatal
13040 1726882409.12525: checking for max_fail_percentage
13040 1726882409.12527: done checking for max_fail_percentage
13040 1726882409.12527: checking to see if all hosts have failed and the running result is not ok
13040 1726882409.12528: done checking to see if all hosts have failed
13040 1726882409.12529: getting the remaining hosts for this loop
13040 1726882409.12530: done getting the remaining hosts for this loop
13040 1726882409.12534: getting the next task for host managed_node1
13040 1726882409.12542: done getting next task for host managed_node1
13040 1726882409.12546: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
13040 1726882409.12548: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882409.12569: getting variables
13040 1726882409.12571: in VariableManager get_vars()
13040 1726882409.12621: Calling all_inventory to load vars for managed_node1
13040 1726882409.12624: Calling groups_inventory to load vars for managed_node1
13040 1726882409.12626: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882409.12636: Calling all_plugins_play to load vars for managed_node1
13040 1726882409.12638: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882409.12641: Calling groups_plugins_play to load vars for managed_node1
13040 1726882409.12909: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d7
13040 1726882409.12912: WORKER PROCESS EXITING
13040 1726882409.12935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882409.13232: done with get_vars()
13040 1726882409.13243: done getting variables
13040 1726882409.13305: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:33:29 -0400 (0:00:00.044) 0:00:06.610 ******
13040 1726882409.13346: entering _queue_task() for managed_node1/package
13040 1726882409.13631: worker is 1 (out of 1 available)
13040 1726882409.13650: exiting _queue_task() for managed_node1/package
13040 1726882409.13666: done queuing things up, now waiting for results queue to drain
13040 1726882409.13667: waiting for pending results...
13040 1726882409.14084: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
13040 1726882409.14201: in run() - task 0e448fcc-3ce9-b123-314b-0000000000d8
13040 1726882409.14211: variable 'ansible_search_path' from source: unknown
13040 1726882409.14215: variable 'ansible_search_path' from source: unknown
13040 1726882409.14244: calling self._execute()
13040 1726882409.14320: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882409.14323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882409.14332: variable 'omit' from source: magic vars
13040 1726882409.14653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882409.16769: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882409.16886: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882409.16920: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882409.16945: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882409.16980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882409.17044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882409.17069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882409.17086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882409.17112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882409.17126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882409.17228: variable 'ansible_distribution' from source: facts
13040 1726882409.17232: variable 'ansible_distribution_major_version' from source: facts
13040 1726882409.17250: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882409.17255: when evaluation is False, skipping this task
13040 1726882409.17258: _execute() done
13040 1726882409.17261: dumping result to json
13040 1726882409.17263: done dumping result, returning
13040 1726882409.17270: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-b123-314b-0000000000d8]
13040 1726882409.17275: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d8
13040 1726882409.17370: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d8
13040 1726882409.17373: WORKER PROCESS EXITING
skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" }
13040 1726882409.17420: no more pending results, returning what we have
13040 1726882409.17423: results queue empty
13040 1726882409.17424: checking for any_errors_fatal
13040 1726882409.17432: done checking for any_errors_fatal
13040 1726882409.17433: checking for max_fail_percentage
13040 1726882409.17435: done checking for max_fail_percentage
13040 1726882409.17436: checking to see if all hosts have failed and the running result is not ok
13040 1726882409.17436: done checking to see if all hosts have failed
13040 1726882409.17437: getting the remaining hosts for this loop
13040 1726882409.17438: done getting the remaining hosts for this loop
13040 1726882409.17442: getting the next task for host managed_node1
13040 1726882409.17448: done getting next task for host managed_node1
13040 1726882409.17454: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
13040 1726882409.17457: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882409.17477: getting variables
13040 1726882409.17479: in VariableManager get_vars()
13040 1726882409.17528: Calling all_inventory to load vars for managed_node1
13040 1726882409.17530: Calling groups_inventory to load vars for managed_node1
13040 1726882409.17532: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882409.17540: Calling all_plugins_play to load vars for managed_node1
13040 1726882409.17542: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882409.17545: Calling groups_plugins_play to load vars for managed_node1
13040 1726882409.17687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882409.17816: done with get_vars()
13040 1726882409.17825: done getting variables
13040 1726882409.17870: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:33:29 -0400 (0:00:00.045) 0:00:06.656 ******
13040 1726882409.17894: entering _queue_task() for managed_node1/package
13040 1726882409.18087: worker is 1 (out of 1 available)
13040 1726882409.18100: exiting _queue_task() for managed_node1/package
13040 1726882409.18111: done queuing things up, now waiting for results queue to drain
13040 1726882409.18113: waiting for pending results...
13040 1726882409.18283: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
13040 1726882409.18371: in run() - task 0e448fcc-3ce9-b123-314b-0000000000d9
13040 1726882409.18382: variable 'ansible_search_path' from source: unknown
13040 1726882409.18385: variable 'ansible_search_path' from source: unknown
13040 1726882409.18413: calling self._execute()
13040 1726882409.18480: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882409.18484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882409.18492: variable 'omit' from source: magic vars
13040 1726882409.18793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882409.21042: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882409.21091: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882409.21126: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882409.21167: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882409.21188: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882409.21248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882409.21273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882409.21290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882409.21321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882409.21331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882409.21430: variable 'ansible_distribution' from source: facts
13040 1726882409.21434: variable 'ansible_distribution_major_version' from source: facts
13040 1726882409.21448: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882409.21451: when evaluation is False, skipping this task
13040 1726882409.21456: _execute() done
13040 1726882409.21458: dumping result to json
13040 1726882409.21461: done dumping result, returning
13040 1726882409.21479: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-b123-314b-0000000000d9]
13040 1726882409.21484: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d9
13040 1726882409.21578: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000d9
13040 1726882409.21580: WORKER PROCESS EXITING
skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" }
13040 1726882409.21621: no more pending results, returning what we have
13040 1726882409.21625: results queue empty
13040 1726882409.21626: checking for any_errors_fatal
13040 1726882409.21632: done checking for any_errors_fatal
13040 1726882409.21633: checking for max_fail_percentage
13040 1726882409.21634: done checking for max_fail_percentage
13040 1726882409.21635: checking to see if all hosts have failed and the running result is not ok
13040 1726882409.21636: done checking to see if all hosts have failed
13040 1726882409.21637: getting the remaining hosts for this loop
13040 1726882409.21638: done getting the remaining hosts for this loop
13040 1726882409.21642: getting the next task for host managed_node1
13040 1726882409.21649: done getting next task for host managed_node1
13040 1726882409.21655: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
13040 1726882409.21658: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882409.21678: getting variables
13040 1726882409.21680: in VariableManager get_vars()
13040 1726882409.21727: Calling all_inventory to load vars for managed_node1
13040 1726882409.21730: Calling groups_inventory to load vars for managed_node1
13040 1726882409.21732: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882409.21740: Calling all_plugins_play to load vars for managed_node1
13040 1726882409.21743: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882409.21746: Calling groups_plugins_play to load vars for managed_node1
13040 1726882409.21916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882409.22039: done with get_vars()
13040 1726882409.22047: done getting variables
13040 1726882409.22094: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:33:29 -0400 (0:00:00.042) 0:00:06.698 ******
13040 1726882409.22115: entering _queue_task() for managed_node1/service
13040 1726882409.22303: worker is 1 (out of 1 available)
13040 1726882409.22315: exiting _queue_task() for managed_node1/service
13040 1726882409.22330: done queuing things up, now waiting for results queue to drain
13040 1726882409.22331: waiting for pending results...
13040 1726882409.22499: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
13040 1726882409.22581: in run() - task 0e448fcc-3ce9-b123-314b-0000000000da
13040 1726882409.22591: variable 'ansible_search_path' from source: unknown
13040 1726882409.22594: variable 'ansible_search_path' from source: unknown
13040 1726882409.22624: calling self._execute()
13040 1726882409.22693: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882409.22707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882409.22723: variable 'omit' from source: magic vars
13040 1726882409.23092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882409.25206: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882409.25289: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882409.25329: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882409.25370: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882409.25401: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882409.25481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882409.25515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882409.25547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882409.25594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882409.25612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882409.25754: variable 'ansible_distribution' from source: facts
13040 1726882409.25770: variable 'ansible_distribution_major_version' from source: facts
13040 1726882409.25785: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882409.25788: when evaluation is False, skipping this task
13040 1726882409.25790: _execute() done
13040 1726882409.25793: dumping result to json
13040 1726882409.25796: done dumping result, returning
13040 1726882409.25803: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-0000000000da]
13040 1726882409.25808: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000da
13040 1726882409.25912: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000da
13040 1726882409.25914: WORKER PROCESS EXITING
skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" }
13040 1726882409.25969: no more pending results, returning what we have
13040 1726882409.25973: results queue empty
13040 1726882409.25974: checking for any_errors_fatal
13040 1726882409.25980: done checking for any_errors_fatal
13040 1726882409.25981: checking for max_fail_percentage
13040 1726882409.25983: done checking for max_fail_percentage
13040 1726882409.25984: checking to see if all hosts have failed and the running result is not ok
13040 1726882409.25984: done checking to see if all hosts have failed
13040 1726882409.25985: getting the remaining hosts for this loop
13040 1726882409.25986: done getting the remaining hosts for this loop
13040 1726882409.25990: getting the next task for host managed_node1
13040 1726882409.25998: done getting next task for host managed_node1
13040 1726882409.26002: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
13040 1726882409.26005: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882409.26025: getting variables
13040 1726882409.26027: in VariableManager get_vars()
13040 1726882409.26080: Calling all_inventory to load vars for managed_node1
13040 1726882409.26083: Calling groups_inventory to load vars for managed_node1
13040 1726882409.26085: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882409.26094: Calling all_plugins_play to load vars for managed_node1
13040 1726882409.26096: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882409.26098: Calling groups_plugins_play to load vars for managed_node1
13040 1726882409.26223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882409.26351: done with get_vars()
13040 1726882409.26364: done getting variables
13040 1726882409.26405: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:33:29 -0400 (0:00:00.043) 0:00:06.741 ******
13040 1726882409.26426: entering _queue_task() for managed_node1/service
13040 1726882409.26625: worker is 1 (out of 1 available)
13040 1726882409.26637: exiting _queue_task() for managed_node1/service
13040 1726882409.26651: done queuing things up, now waiting for results queue to drain
13040 1726882409.26655: waiting for pending results...
13040 1726882409.26830: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
13040 1726882409.26916: in run() - task 0e448fcc-3ce9-b123-314b-0000000000db
13040 1726882409.26932: variable 'ansible_search_path' from source: unknown
13040 1726882409.26935: variable 'ansible_search_path' from source: unknown
13040 1726882409.26965: calling self._execute()
13040 1726882409.27033: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882409.27036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882409.27045: variable 'omit' from source: magic vars
13040 1726882409.27350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882409.29498: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882409.29547: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882409.29577: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882409.29601: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882409.29621: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882409.29683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882409.29702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882409.29719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882409.29745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882409.29757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882409.29857: variable 'ansible_distribution' from source: facts
13040 1726882409.29869: variable 'ansible_distribution_major_version' from source: facts
13040 1726882409.29886: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882409.29889: when evaluation is False, skipping this task
13040 1726882409.29892: _execute() done
13040 1726882409.29894: dumping result to json
13040 1726882409.29896: done dumping result, returning
13040 1726882409.29905: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-b123-314b-0000000000db]
13040 1726882409.29909: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000db
13040 1726882409.29999: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000db
13040 1726882409.30001: WORKER PROCESS EXITING
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
13040 1726882409.30040: no more pending results, returning what we have
13040 1726882409.30044: results queue empty
13040 1726882409.30045: checking for any_errors_fatal
13040 1726882409.30051: done checking for any_errors_fatal
13040 1726882409.30051: checking for max_fail_percentage
13040 1726882409.30053: done checking for max_fail_percentage
13040 1726882409.30054: checking to see if all hosts have failed and the running result is not ok
13040 1726882409.30055: done checking to see if all hosts have failed
13040 1726882409.30055: getting the remaining hosts for this loop
13040 1726882409.30057: done getting the remaining hosts for this loop
13040 1726882409.30060: getting the next task for host managed_node1
13040 1726882409.30068: done getting next task for host managed_node1
13040 1726882409.30072: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
13040 1726882409.30074: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 13040 1726882409.30093: getting variables 13040 1726882409.30095: in VariableManager get_vars() 13040 1726882409.30145: Calling all_inventory to load vars for managed_node1 13040 1726882409.30149: Calling groups_inventory to load vars for managed_node1 13040 1726882409.30151: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.30160: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.30162: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.30167: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.30296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.30460: done with get_vars() 13040 1726882409.30469: done getting variables 13040 1726882409.30510: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:29 -0400 (0:00:00.041) 0:00:06.782 ****** 13040 1726882409.30534: entering _queue_task() for managed_node1/service 13040 1726882409.30732: worker is 1 (out of 1 available) 13040 1726882409.30745: exiting _queue_task() for managed_node1/service 13040 1726882409.30757: done queuing things up, now waiting for results queue to drain 13040 1726882409.30759: waiting for pending results... 
13040 1726882409.30931: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13040 1726882409.31014: in run() - task 0e448fcc-3ce9-b123-314b-0000000000dc 13040 1726882409.31026: variable 'ansible_search_path' from source: unknown 13040 1726882409.31029: variable 'ansible_search_path' from source: unknown 13040 1726882409.31062: calling self._execute() 13040 1726882409.31124: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.31127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.31137: variable 'omit' from source: magic vars 13040 1726882409.31446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.33056: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.33108: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.33135: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.33160: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.33185: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.33243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.33271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.33291: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.33317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.33327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.33430: variable 'ansible_distribution' from source: facts 13040 1726882409.33436: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.33453: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.33456: when evaluation is False, skipping this task 13040 1726882409.33459: _execute() done 13040 1726882409.33462: dumping result to json 13040 1726882409.33466: done dumping result, returning 13040 1726882409.33474: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-b123-314b-0000000000dc] 13040 1726882409.33479: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000dc 13040 1726882409.33571: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000dc 13040 1726882409.33574: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.33644: no more pending results, returning what we have 13040 1726882409.33647: results queue empty 13040 1726882409.33648: checking for any_errors_fatal 13040 1726882409.33657: done checking for 
any_errors_fatal 13040 1726882409.33658: checking for max_fail_percentage 13040 1726882409.33659: done checking for max_fail_percentage 13040 1726882409.33660: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.33661: done checking to see if all hosts have failed 13040 1726882409.33661: getting the remaining hosts for this loop 13040 1726882409.33663: done getting the remaining hosts for this loop 13040 1726882409.33668: getting the next task for host managed_node1 13040 1726882409.33673: done getting next task for host managed_node1 13040 1726882409.33682: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13040 1726882409.33684: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.33706: getting variables 13040 1726882409.33707: in VariableManager get_vars() 13040 1726882409.33753: Calling all_inventory to load vars for managed_node1 13040 1726882409.33755: Calling groups_inventory to load vars for managed_node1 13040 1726882409.33758: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.33767: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.33769: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.33771: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.33889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.34016: done with get_vars() 13040 1726882409.34025: done getting variables 13040 1726882409.34068: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:29 -0400 (0:00:00.035) 0:00:06.818 ****** 13040 1726882409.34090: entering _queue_task() for managed_node1/service 13040 1726882409.34281: worker is 1 (out of 1 available) 13040 1726882409.34294: exiting _queue_task() for managed_node1/service 13040 1726882409.34307: done queuing things up, now waiting for results queue to drain 13040 1726882409.34308: waiting for pending results... 
13040 1726882409.34480: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 13040 1726882409.34560: in run() - task 0e448fcc-3ce9-b123-314b-0000000000dd 13040 1726882409.34573: variable 'ansible_search_path' from source: unknown 13040 1726882409.34577: variable 'ansible_search_path' from source: unknown 13040 1726882409.34603: calling self._execute() 13040 1726882409.34672: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.34677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.34686: variable 'omit' from source: magic vars 13040 1726882409.34992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.36611: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.36659: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.36688: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.36716: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.36736: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.36794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.36814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.36834: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.36865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.36876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.36975: variable 'ansible_distribution' from source: facts 13040 1726882409.36980: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.36994: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.36997: when evaluation is False, skipping this task 13040 1726882409.36999: _execute() done 13040 1726882409.37002: dumping result to json 13040 1726882409.37004: done dumping result, returning 13040 1726882409.37012: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-b123-314b-0000000000dd] 13040 1726882409.37016: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000dd 13040 1726882409.37104: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000dd 13040 1726882409.37107: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13040 1726882409.37173: no more pending results, returning what we have 13040 1726882409.37177: results queue empty 13040 1726882409.37178: checking for any_errors_fatal 13040 1726882409.37185: done checking for any_errors_fatal 13040 1726882409.37186: checking for 
max_fail_percentage 13040 1726882409.37187: done checking for max_fail_percentage 13040 1726882409.37188: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.37189: done checking to see if all hosts have failed 13040 1726882409.37190: getting the remaining hosts for this loop 13040 1726882409.37191: done getting the remaining hosts for this loop 13040 1726882409.37194: getting the next task for host managed_node1 13040 1726882409.37200: done getting next task for host managed_node1 13040 1726882409.37204: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13040 1726882409.37206: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.37231: getting variables 13040 1726882409.37232: in VariableManager get_vars() 13040 1726882409.37279: Calling all_inventory to load vars for managed_node1 13040 1726882409.37282: Calling groups_inventory to load vars for managed_node1 13040 1726882409.37284: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.37291: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.37293: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.37295: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.37451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.37574: done with get_vars() 13040 1726882409.37581: done getting variables 13040 1726882409.37620: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:29 -0400 (0:00:00.035) 0:00:06.853 ****** 13040 1726882409.37641: entering _queue_task() for managed_node1/copy 13040 1726882409.37827: worker is 1 (out of 1 available) 13040 1726882409.37840: exiting _queue_task() for managed_node1/copy 13040 1726882409.37851: done queuing things up, now waiting for results queue to drain 13040 1726882409.37852: waiting for pending results... 
13040 1726882409.38027: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13040 1726882409.38111: in run() - task 0e448fcc-3ce9-b123-314b-0000000000de 13040 1726882409.38122: variable 'ansible_search_path' from source: unknown 13040 1726882409.38126: variable 'ansible_search_path' from source: unknown 13040 1726882409.38152: calling self._execute() 13040 1726882409.38215: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.38225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.38232: variable 'omit' from source: magic vars 13040 1726882409.38536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.40087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.40326: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.40356: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.40380: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.40403: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.40457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.40484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.40504: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.40530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.40540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.40636: variable 'ansible_distribution' from source: facts 13040 1726882409.40640: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.40657: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.40660: when evaluation is False, skipping this task 13040 1726882409.40665: _execute() done 13040 1726882409.40668: dumping result to json 13040 1726882409.40670: done dumping result, returning 13040 1726882409.40677: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-b123-314b-0000000000de] 13040 1726882409.40683: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000de 13040 1726882409.40779: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000de 13040 1726882409.40782: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.40837: no more pending results, returning what we have 13040 1726882409.40840: results queue empty 13040 1726882409.40841: checking for any_errors_fatal 13040 1726882409.40847: 
done checking for any_errors_fatal 13040 1726882409.40848: checking for max_fail_percentage 13040 1726882409.40849: done checking for max_fail_percentage 13040 1726882409.40850: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.40851: done checking to see if all hosts have failed 13040 1726882409.40852: getting the remaining hosts for this loop 13040 1726882409.40853: done getting the remaining hosts for this loop 13040 1726882409.40856: getting the next task for host managed_node1 13040 1726882409.40862: done getting next task for host managed_node1 13040 1726882409.40867: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13040 1726882409.40870: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.40890: getting variables 13040 1726882409.40892: in VariableManager get_vars() 13040 1726882409.40941: Calling all_inventory to load vars for managed_node1 13040 1726882409.40943: Calling groups_inventory to load vars for managed_node1 13040 1726882409.40945: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.40954: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.40956: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.40958: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.41070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.41193: done with get_vars() 13040 1726882409.41200: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:29 -0400 (0:00:00.036) 0:00:06.889 ****** 13040 1726882409.41259: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13040 1726882409.41440: worker is 1 (out of 1 available) 13040 1726882409.41452: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13040 1726882409.41466: done queuing things up, now waiting for results queue to drain 13040 1726882409.41467: waiting for pending results... 
13040 1726882409.41643: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13040 1726882409.41718: in run() - task 0e448fcc-3ce9-b123-314b-0000000000df 13040 1726882409.41731: variable 'ansible_search_path' from source: unknown 13040 1726882409.41734: variable 'ansible_search_path' from source: unknown 13040 1726882409.41767: calling self._execute() 13040 1726882409.41827: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.41831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.41839: variable 'omit' from source: magic vars 13040 1726882409.42148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.43938: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.43985: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.44011: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.44035: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.44057: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.44119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.44139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.44160: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.44193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.44204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.44305: variable 'ansible_distribution' from source: facts 13040 1726882409.44310: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.44325: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.44328: when evaluation is False, skipping this task 13040 1726882409.44331: _execute() done 13040 1726882409.44333: dumping result to json 13040 1726882409.44336: done dumping result, returning 13040 1726882409.44343: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-b123-314b-0000000000df] 13040 1726882409.44347: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000df 13040 1726882409.44444: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000df 13040 1726882409.44446: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.44498: no more pending results, returning what we have 13040 1726882409.44502: results queue empty 13040 1726882409.44503: checking for any_errors_fatal 13040 1726882409.44511: done 
checking for any_errors_fatal 13040 1726882409.44511: checking for max_fail_percentage 13040 1726882409.44513: done checking for max_fail_percentage 13040 1726882409.44514: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.44515: done checking to see if all hosts have failed 13040 1726882409.44515: getting the remaining hosts for this loop 13040 1726882409.44516: done getting the remaining hosts for this loop 13040 1726882409.44520: getting the next task for host managed_node1 13040 1726882409.44526: done getting next task for host managed_node1 13040 1726882409.44530: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13040 1726882409.44532: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.44550: getting variables 13040 1726882409.44551: in VariableManager get_vars() 13040 1726882409.44608: Calling all_inventory to load vars for managed_node1 13040 1726882409.44611: Calling groups_inventory to load vars for managed_node1 13040 1726882409.44614: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.44622: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.44625: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.44627: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.44791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.44913: done with get_vars() 13040 1726882409.44921: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:29 -0400 (0:00:00.037) 0:00:06.927 ****** 13040 1726882409.44980: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13040 1726882409.45180: worker is 1 (out of 1 available) 13040 1726882409.45194: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13040 1726882409.45205: done queuing things up, now waiting for results queue to drain 13040 1726882409.45206: waiting for pending results... 
13040 1726882409.45378: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 13040 1726882409.45457: in run() - task 0e448fcc-3ce9-b123-314b-0000000000e0 13040 1726882409.45472: variable 'ansible_search_path' from source: unknown 13040 1726882409.45476: variable 'ansible_search_path' from source: unknown 13040 1726882409.45504: calling self._execute() 13040 1726882409.45575: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.45580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.45588: variable 'omit' from source: magic vars 13040 1726882409.45895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.47668: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.47711: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.47738: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.47775: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.47795: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.47853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.47875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.47892: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.47919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.47930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.48023: variable 'ansible_distribution' from source: facts 13040 1726882409.48028: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.48045: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.48048: when evaluation is False, skipping this task 13040 1726882409.48055: _execute() done 13040 1726882409.48057: dumping result to json 13040 1726882409.48060: done dumping result, returning 13040 1726882409.48063: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-b123-314b-0000000000e0] 13040 1726882409.48070: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000e0 13040 1726882409.48150: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000e0 13040 1726882409.48155: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.48221: no more pending results, returning what we have 13040 1726882409.48225: results queue empty 13040 1726882409.48226: checking for any_errors_fatal 13040 1726882409.48233: done checking for 
any_errors_fatal 13040 1726882409.48234: checking for max_fail_percentage 13040 1726882409.48236: done checking for max_fail_percentage 13040 1726882409.48237: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.48237: done checking to see if all hosts have failed 13040 1726882409.48238: getting the remaining hosts for this loop 13040 1726882409.48239: done getting the remaining hosts for this loop 13040 1726882409.48243: getting the next task for host managed_node1 13040 1726882409.48248: done getting next task for host managed_node1 13040 1726882409.48255: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13040 1726882409.48257: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.48279: getting variables 13040 1726882409.48281: in VariableManager get_vars() 13040 1726882409.48329: Calling all_inventory to load vars for managed_node1 13040 1726882409.48332: Calling groups_inventory to load vars for managed_node1 13040 1726882409.48334: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.48343: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.48345: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.48347: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.48465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.48592: done with get_vars() 13040 1726882409.48600: done getting variables 13040 1726882409.48641: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:29 -0400 (0:00:00.036) 0:00:06.963 ****** 13040 1726882409.48668: entering _queue_task() for managed_node1/debug 13040 1726882409.48859: worker is 1 (out of 1 available) 13040 1726882409.48874: exiting _queue_task() for managed_node1/debug 13040 1726882409.48885: done queuing things up, now waiting for results queue to drain 13040 1726882409.48886: waiting for pending results... 
13040 1726882409.49050: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13040 1726882409.49135: in run() - task 0e448fcc-3ce9-b123-314b-0000000000e1 13040 1726882409.49147: variable 'ansible_search_path' from source: unknown 13040 1726882409.49151: variable 'ansible_search_path' from source: unknown 13040 1726882409.49180: calling self._execute() 13040 1726882409.49240: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.49243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.49250: variable 'omit' from source: magic vars 13040 1726882409.49554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.51295: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.51340: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.51367: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.51392: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.51412: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.51469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.51489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.51508: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.51534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.51548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.51644: variable 'ansible_distribution' from source: facts 13040 1726882409.51648: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.51667: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.51670: when evaluation is False, skipping this task 13040 1726882409.51672: _execute() done 13040 1726882409.51675: dumping result to json 13040 1726882409.51677: done dumping result, returning 13040 1726882409.51683: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-b123-314b-0000000000e1] 13040 1726882409.51688: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000e1 13040 1726882409.51770: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000e1 13040 1726882409.51776: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882409.51817: no more pending results, returning what we have 13040 1726882409.51821: results queue empty 13040 1726882409.51822: checking for any_errors_fatal 13040 1726882409.51827: done checking for any_errors_fatal 13040 1726882409.51827: checking 
for max_fail_percentage 13040 1726882409.51829: done checking for max_fail_percentage 13040 1726882409.51830: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.51831: done checking to see if all hosts have failed 13040 1726882409.51831: getting the remaining hosts for this loop 13040 1726882409.51833: done getting the remaining hosts for this loop 13040 1726882409.51836: getting the next task for host managed_node1 13040 1726882409.51842: done getting next task for host managed_node1 13040 1726882409.51846: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13040 1726882409.51849: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.51871: getting variables 13040 1726882409.51873: in VariableManager get_vars() 13040 1726882409.51922: Calling all_inventory to load vars for managed_node1 13040 1726882409.51925: Calling groups_inventory to load vars for managed_node1 13040 1726882409.51927: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.51935: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.51937: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.51939: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.52056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.52220: done with get_vars() 13040 1726882409.52227: done getting variables 13040 1726882409.52271: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:29 -0400 (0:00:00.036) 0:00:07.000 ****** 13040 1726882409.52293: entering _queue_task() for managed_node1/debug 13040 1726882409.52479: worker is 1 (out of 1 available) 13040 1726882409.52491: exiting _queue_task() for managed_node1/debug 13040 1726882409.52504: done queuing things up, now waiting for results queue to drain 13040 1726882409.52505: waiting for pending results... 
13040 1726882409.52666: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13040 1726882409.52744: in run() - task 0e448fcc-3ce9-b123-314b-0000000000e2 13040 1726882409.52763: variable 'ansible_search_path' from source: unknown 13040 1726882409.52769: variable 'ansible_search_path' from source: unknown 13040 1726882409.52795: calling self._execute() 13040 1726882409.52861: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.52865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.52877: variable 'omit' from source: magic vars 13040 1726882409.53184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.54927: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.54972: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.55006: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.55035: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.55053: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.55111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.55133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.55152: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.55182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.55192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.55285: variable 'ansible_distribution' from source: facts 13040 1726882409.55289: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.55304: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.55307: when evaluation is False, skipping this task 13040 1726882409.55309: _execute() done 13040 1726882409.55311: dumping result to json 13040 1726882409.55314: done dumping result, returning 13040 1726882409.55321: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-b123-314b-0000000000e2] 13040 1726882409.55325: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000e2 13040 1726882409.55407: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000e2 13040 1726882409.55410: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882409.55484: no more pending results, returning what we have 13040 1726882409.55488: results queue empty 13040 1726882409.55489: checking for any_errors_fatal 13040 1726882409.55494: done checking for any_errors_fatal 13040 1726882409.55495: checking 
for max_fail_percentage 13040 1726882409.55497: done checking for max_fail_percentage 13040 1726882409.55498: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.55498: done checking to see if all hosts have failed 13040 1726882409.55499: getting the remaining hosts for this loop 13040 1726882409.55500: done getting the remaining hosts for this loop 13040 1726882409.55504: getting the next task for host managed_node1 13040 1726882409.55509: done getting next task for host managed_node1 13040 1726882409.55513: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13040 1726882409.55515: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.55536: getting variables 13040 1726882409.55538: in VariableManager get_vars() 13040 1726882409.55588: Calling all_inventory to load vars for managed_node1 13040 1726882409.55591: Calling groups_inventory to load vars for managed_node1 13040 1726882409.55592: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.55599: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.55600: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.55602: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.55712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.55834: done with get_vars() 13040 1726882409.55842: done getting variables 13040 1726882409.55885: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:29 -0400 (0:00:00.036) 0:00:07.036 ****** 13040 1726882409.55907: entering _queue_task() for managed_node1/debug 13040 1726882409.56088: worker is 1 (out of 1 available) 13040 1726882409.56101: exiting _queue_task() for managed_node1/debug 13040 1726882409.56113: done queuing things up, now waiting for results queue to drain 13040 1726882409.56114: waiting for pending results... 
13040 1726882409.56290: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13040 1726882409.56371: in run() - task 0e448fcc-3ce9-b123-314b-0000000000e3 13040 1726882409.56380: variable 'ansible_search_path' from source: unknown 13040 1726882409.56384: variable 'ansible_search_path' from source: unknown 13040 1726882409.56410: calling self._execute() 13040 1726882409.56473: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.56476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.56485: variable 'omit' from source: magic vars 13040 1726882409.56787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.58528: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.58573: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.58600: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.58626: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.58645: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.58705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.58725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.58742: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.58772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.58783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.58875: variable 'ansible_distribution' from source: facts 13040 1726882409.58879: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.58893: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.58896: when evaluation is False, skipping this task 13040 1726882409.58898: _execute() done 13040 1726882409.58902: dumping result to json 13040 1726882409.58904: done dumping result, returning 13040 1726882409.58910: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-b123-314b-0000000000e3] 13040 1726882409.58920: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000e3 13040 1726882409.59003: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000e3 13040 1726882409.59005: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882409.59083: no more pending results, returning what we have 13040 1726882409.59086: results queue empty 13040 1726882409.59087: checking for any_errors_fatal 13040 1726882409.59093: done checking for any_errors_fatal 13040 1726882409.59093: checking for 
max_fail_percentage 13040 1726882409.59095: done checking for max_fail_percentage 13040 1726882409.59096: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.59097: done checking to see if all hosts have failed 13040 1726882409.59097: getting the remaining hosts for this loop 13040 1726882409.59098: done getting the remaining hosts for this loop 13040 1726882409.59102: getting the next task for host managed_node1 13040 1726882409.59107: done getting next task for host managed_node1 13040 1726882409.59112: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13040 1726882409.59114: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.59137: getting variables 13040 1726882409.59139: in VariableManager get_vars() 13040 1726882409.59185: Calling all_inventory to load vars for managed_node1 13040 1726882409.59187: Calling groups_inventory to load vars for managed_node1 13040 1726882409.59188: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.59195: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.59197: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.59199: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.59504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.59623: done with get_vars() 13040 1726882409.59630: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:29 -0400 (0:00:00.037) 0:00:07.074 ****** 13040 1726882409.59699: entering _queue_task() for managed_node1/ping 13040 1726882409.59885: worker is 1 (out of 1 available) 13040 1726882409.59897: exiting _queue_task() for managed_node1/ping 13040 1726882409.59908: done queuing things up, now waiting for results queue to drain 13040 1726882409.59909: waiting for pending results... 
13040 1726882409.60091: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 13040 1726882409.60187: in run() - task 0e448fcc-3ce9-b123-314b-0000000000e4 13040 1726882409.60197: variable 'ansible_search_path' from source: unknown 13040 1726882409.60200: variable 'ansible_search_path' from source: unknown 13040 1726882409.60230: calling self._execute() 13040 1726882409.60296: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.60300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.60308: variable 'omit' from source: magic vars 13040 1726882409.60618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.62212: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.62268: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.62296: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.62320: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.62340: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.62399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.62422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.62440: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.62468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.62480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.62585: variable 'ansible_distribution' from source: facts 13040 1726882409.62588: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.62605: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.62612: when evaluation is False, skipping this task 13040 1726882409.62615: _execute() done 13040 1726882409.62617: dumping result to json 13040 1726882409.62622: done dumping result, returning 13040 1726882409.62631: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-b123-314b-0000000000e4] 13040 1726882409.62635: sending task result for task 0e448fcc-3ce9-b123-314b-0000000000e4 13040 1726882409.62719: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000000e4 13040 1726882409.62721: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.62779: no more pending results, returning what we have 13040 1726882409.62782: results queue empty 13040 1726882409.62783: checking for any_errors_fatal 13040 1726882409.62792: done checking for 
any_errors_fatal 13040 1726882409.62792: checking for max_fail_percentage 13040 1726882409.62794: done checking for max_fail_percentage 13040 1726882409.62795: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.62796: done checking to see if all hosts have failed 13040 1726882409.62796: getting the remaining hosts for this loop 13040 1726882409.62798: done getting the remaining hosts for this loop 13040 1726882409.62801: getting the next task for host managed_node1 13040 1726882409.62811: done getting next task for host managed_node1 13040 1726882409.62812: ^ task is: TASK: meta (role_complete) 13040 1726882409.62815: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.62840: getting variables 13040 1726882409.62842: in VariableManager get_vars() 13040 1726882409.62892: Calling all_inventory to load vars for managed_node1 13040 1726882409.62895: Calling groups_inventory to load vars for managed_node1 13040 1726882409.62897: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.62905: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.62907: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.62909: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.63024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.63160: done with get_vars() 13040 1726882409.63170: done getting variables 13040 1726882409.63222: done queuing things up, now waiting for results queue to drain 13040 1726882409.63224: results queue empty 13040 1726882409.63224: checking for any_errors_fatal 13040 1726882409.63226: done checking for any_errors_fatal 13040 1726882409.63226: checking for max_fail_percentage 13040 1726882409.63227: done checking for max_fail_percentage 13040 1726882409.63227: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.63227: done checking to see if all hosts have failed 13040 1726882409.63228: getting the remaining hosts for this loop 13040 1726882409.63229: done getting the remaining hosts for this loop 13040 1726882409.63230: getting the next task for host managed_node1 13040 1726882409.63233: done getting next task for host managed_node1 13040 1726882409.63235: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13040 1726882409.63236: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882409.63242: getting variables 13040 1726882409.63243: in VariableManager get_vars() 13040 1726882409.63258: Calling all_inventory to load vars for managed_node1 13040 1726882409.63260: Calling groups_inventory to load vars for managed_node1 13040 1726882409.63261: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.63271: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.63273: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.63274: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.63353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.63499: done with get_vars() 13040 1726882409.63504: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:29 -0400 (0:00:00.038) 0:00:07.112 ****** 13040 1726882409.63550: entering _queue_task() for managed_node1/include_tasks 13040 1726882409.63744: worker is 1 (out of 1 available) 13040 1726882409.63759: exiting _queue_task() for managed_node1/include_tasks 13040 1726882409.63773: done queuing things up, now waiting for results queue to drain 13040 1726882409.63775: waiting for pending results... 
13040 1726882409.63945: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13040 1726882409.64030: in run() - task 0e448fcc-3ce9-b123-314b-00000000011b 13040 1726882409.64048: variable 'ansible_search_path' from source: unknown 13040 1726882409.64052: variable 'ansible_search_path' from source: unknown 13040 1726882409.64085: calling self._execute() 13040 1726882409.64167: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.64171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.64178: variable 'omit' from source: magic vars 13040 1726882409.64485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.66528: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.66590: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.66617: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.66643: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.66673: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.66728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.66748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.66770: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.66801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.66812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.66912: variable 'ansible_distribution' from source: facts 13040 1726882409.66916: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.66931: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.66934: when evaluation is False, skipping this task 13040 1726882409.66936: _execute() done 13040 1726882409.66939: dumping result to json 13040 1726882409.66941: done dumping result, returning 13040 1726882409.66949: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-b123-314b-00000000011b] 13040 1726882409.66956: sending task result for task 0e448fcc-3ce9-b123-314b-00000000011b 13040 1726882409.67045: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000011b 13040 1726882409.67048: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.67097: no more pending results, returning what we have 13040 1726882409.67101: results queue empty 13040 1726882409.67102: checking for any_errors_fatal 13040 1726882409.67104: done checking for 
any_errors_fatal 13040 1726882409.67105: checking for max_fail_percentage 13040 1726882409.67106: done checking for max_fail_percentage 13040 1726882409.67107: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.67108: done checking to see if all hosts have failed 13040 1726882409.67109: getting the remaining hosts for this loop 13040 1726882409.67110: done getting the remaining hosts for this loop 13040 1726882409.67114: getting the next task for host managed_node1 13040 1726882409.67121: done getting next task for host managed_node1 13040 1726882409.67124: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13040 1726882409.67127: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.67144: getting variables 13040 1726882409.67146: in VariableManager get_vars() 13040 1726882409.67197: Calling all_inventory to load vars for managed_node1 13040 1726882409.67200: Calling groups_inventory to load vars for managed_node1 13040 1726882409.67202: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.67210: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.67213: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.67215: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.67339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.67470: done with get_vars() 13040 1726882409.67479: done getting variables 13040 1726882409.67519: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:29 -0400 (0:00:00.039) 0:00:07.152 ****** 13040 1726882409.67541: entering _queue_task() for managed_node1/debug 13040 1726882409.67737: worker is 1 (out of 1 available) 13040 1726882409.67750: exiting _queue_task() for managed_node1/debug 13040 1726882409.67762: done queuing things up, now waiting for results queue to drain 13040 1726882409.67764: waiting for pending results... 
13040 1726882409.67935: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 13040 1726882409.68024: in run() - task 0e448fcc-3ce9-b123-314b-00000000011c 13040 1726882409.68035: variable 'ansible_search_path' from source: unknown 13040 1726882409.68039: variable 'ansible_search_path' from source: unknown 13040 1726882409.68069: calling self._execute() 13040 1726882409.68134: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.68138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.68146: variable 'omit' from source: magic vars 13040 1726882409.68613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.70918: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.70997: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.71040: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.71085: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.71118: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.71203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.71238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.71280: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.71328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.71348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.71494: variable 'ansible_distribution' from source: facts 13040 1726882409.71509: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.71531: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.71539: when evaluation is False, skipping this task 13040 1726882409.71545: _execute() done 13040 1726882409.71550: dumping result to json 13040 1726882409.71561: done dumping result, returning 13040 1726882409.71575: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-b123-314b-00000000011c] 13040 1726882409.71585: sending task result for task 0e448fcc-3ce9-b123-314b-00000000011c 13040 1726882409.71702: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000011c 13040 1726882409.71710: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882409.71765: no more pending results, returning what we have 13040 1726882409.71769: results queue empty 13040 1726882409.71770: checking for any_errors_fatal 13040 1726882409.71775: done checking for any_errors_fatal 13040 1726882409.71776: checking for max_fail_percentage 
13040 1726882409.71778: done checking for max_fail_percentage 13040 1726882409.71779: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.71779: done checking to see if all hosts have failed 13040 1726882409.71780: getting the remaining hosts for this loop 13040 1726882409.71781: done getting the remaining hosts for this loop 13040 1726882409.71785: getting the next task for host managed_node1 13040 1726882409.71792: done getting next task for host managed_node1 13040 1726882409.71797: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13040 1726882409.71800: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.71820: getting variables 13040 1726882409.71822: in VariableManager get_vars() 13040 1726882409.71886: Calling all_inventory to load vars for managed_node1 13040 1726882409.71889: Calling groups_inventory to load vars for managed_node1 13040 1726882409.71892: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.71901: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.71903: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.71906: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.72077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.72279: done with get_vars() 13040 1726882409.72288: done getting variables 13040 1726882409.72328: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:29 -0400 (0:00:00.048) 0:00:07.200 ****** 13040 1726882409.72350: entering _queue_task() for managed_node1/fail 13040 1726882409.72536: worker is 1 (out of 1 available) 13040 1726882409.72550: exiting _queue_task() for managed_node1/fail 13040 1726882409.72565: done queuing things up, now waiting for results queue to drain 13040 1726882409.72567: waiting for pending results... 
13040 1726882409.72731: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13040 1726882409.72814: in run() - task 0e448fcc-3ce9-b123-314b-00000000011d 13040 1726882409.72825: variable 'ansible_search_path' from source: unknown 13040 1726882409.72829: variable 'ansible_search_path' from source: unknown 13040 1726882409.72858: calling self._execute() 13040 1726882409.72924: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.72928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.72936: variable 'omit' from source: magic vars 13040 1726882409.73235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.75232: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.75290: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.75316: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.75341: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.75365: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.75421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.75441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882409.75461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.75494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.75506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.75604: variable 'ansible_distribution' from source: facts 13040 1726882409.75609: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.75623: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.75626: when evaluation is False, skipping this task 13040 1726882409.75629: _execute() done 13040 1726882409.75632: dumping result to json 13040 1726882409.75634: done dumping result, returning 13040 1726882409.75642: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-b123-314b-00000000011d] 13040 1726882409.75646: sending task result for task 0e448fcc-3ce9-b123-314b-00000000011d 13040 1726882409.75731: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000011d 13040 1726882409.75733: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.75776: no more pending results, returning what we 
have 13040 1726882409.75780: results queue empty 13040 1726882409.75781: checking for any_errors_fatal 13040 1726882409.75786: done checking for any_errors_fatal 13040 1726882409.75787: checking for max_fail_percentage 13040 1726882409.75789: done checking for max_fail_percentage 13040 1726882409.75789: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.75790: done checking to see if all hosts have failed 13040 1726882409.75791: getting the remaining hosts for this loop 13040 1726882409.75792: done getting the remaining hosts for this loop 13040 1726882409.75796: getting the next task for host managed_node1 13040 1726882409.75802: done getting next task for host managed_node1 13040 1726882409.75805: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13040 1726882409.75808: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.75826: getting variables 13040 1726882409.75828: in VariableManager get_vars() 13040 1726882409.75877: Calling all_inventory to load vars for managed_node1 13040 1726882409.75879: Calling groups_inventory to load vars for managed_node1 13040 1726882409.75882: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.75890: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.75892: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.75894: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.76009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.76142: done with get_vars() 13040 1726882409.76150: done getting variables 13040 1726882409.76190: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:29 -0400 (0:00:00.038) 0:00:07.239 ****** 13040 1726882409.76213: entering _queue_task() for managed_node1/fail 13040 1726882409.76390: worker is 1 (out of 1 available) 13040 1726882409.76404: exiting _queue_task() for managed_node1/fail 13040 1726882409.76416: done queuing things up, now waiting for results queue to drain 13040 1726882409.76417: waiting for pending results... 
13040 1726882409.76602: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13040 1726882409.76704: in run() - task 0e448fcc-3ce9-b123-314b-00000000011e 13040 1726882409.76723: variable 'ansible_search_path' from source: unknown 13040 1726882409.76730: variable 'ansible_search_path' from source: unknown 13040 1726882409.76771: calling self._execute() 13040 1726882409.76866: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.76878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.76893: variable 'omit' from source: magic vars 13040 1726882409.77344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.79305: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.79350: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.79380: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.79405: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.79428: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.79490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.79510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 
1726882409.79528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.79554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.79569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.79666: variable 'ansible_distribution' from source: facts 13040 1726882409.79672: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.79688: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.79693: when evaluation is False, skipping this task 13040 1726882409.79695: _execute() done 13040 1726882409.79697: dumping result to json 13040 1726882409.79701: done dumping result, returning 13040 1726882409.79708: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-b123-314b-00000000011e] 13040 1726882409.79713: sending task result for task 0e448fcc-3ce9-b123-314b-00000000011e 13040 1726882409.79803: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000011e 13040 1726882409.79806: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.79845: no more pending results, returning what we have 13040 1726882409.79849: results 
queue empty 13040 1726882409.79850: checking for any_errors_fatal 13040 1726882409.79857: done checking for any_errors_fatal 13040 1726882409.79857: checking for max_fail_percentage 13040 1726882409.79859: done checking for max_fail_percentage 13040 1726882409.79860: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.79862: done checking to see if all hosts have failed 13040 1726882409.79862: getting the remaining hosts for this loop 13040 1726882409.79865: done getting the remaining hosts for this loop 13040 1726882409.79869: getting the next task for host managed_node1 13040 1726882409.79875: done getting next task for host managed_node1 13040 1726882409.79878: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13040 1726882409.79881: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.79899: getting variables 13040 1726882409.79901: in VariableManager get_vars() 13040 1726882409.79945: Calling all_inventory to load vars for managed_node1 13040 1726882409.79948: Calling groups_inventory to load vars for managed_node1 13040 1726882409.79950: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.79961: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.79963: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.79967: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.80121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.80270: done with get_vars() 13040 1726882409.80278: done getting variables 13040 1726882409.80318: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:29 -0400 (0:00:00.041) 0:00:07.280 ****** 13040 1726882409.80343: entering _queue_task() for managed_node1/fail 13040 1726882409.80544: worker is 1 (out of 1 available) 13040 1726882409.80560: exiting _queue_task() for managed_node1/fail 13040 1726882409.80575: done queuing things up, now waiting for results queue to drain 13040 1726882409.80578: waiting for pending results... 
13040 1726882409.80818: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13040 1726882409.80957: in run() - task 0e448fcc-3ce9-b123-314b-00000000011f 13040 1726882409.80981: variable 'ansible_search_path' from source: unknown 13040 1726882409.80988: variable 'ansible_search_path' from source: unknown 13040 1726882409.81025: calling self._execute() 13040 1726882409.81122: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.81132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.81145: variable 'omit' from source: magic vars 13040 1726882409.81569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.83520: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.83577: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.83605: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.83629: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.83649: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.83714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.83733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 
1726882409.83751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.83784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.83794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.83890: variable 'ansible_distribution' from source: facts 13040 1726882409.83894: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.83908: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.83911: when evaluation is False, skipping this task 13040 1726882409.83914: _execute() done 13040 1726882409.83916: dumping result to json 13040 1726882409.83918: done dumping result, returning 13040 1726882409.83927: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-b123-314b-00000000011f] 13040 1726882409.83933: sending task result for task 0e448fcc-3ce9-b123-314b-00000000011f 13040 1726882409.84018: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000011f 13040 1726882409.84020: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.84080: no more pending results, returning what we have 13040 1726882409.84084: results queue 
empty 13040 1726882409.84085: checking for any_errors_fatal 13040 1726882409.84090: done checking for any_errors_fatal 13040 1726882409.84090: checking for max_fail_percentage 13040 1726882409.84092: done checking for max_fail_percentage 13040 1726882409.84093: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.84093: done checking to see if all hosts have failed 13040 1726882409.84094: getting the remaining hosts for this loop 13040 1726882409.84095: done getting the remaining hosts for this loop 13040 1726882409.84099: getting the next task for host managed_node1 13040 1726882409.84104: done getting next task for host managed_node1 13040 1726882409.84108: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13040 1726882409.84111: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.84128: getting variables 13040 1726882409.84132: in VariableManager get_vars() 13040 1726882409.84177: Calling all_inventory to load vars for managed_node1 13040 1726882409.84180: Calling groups_inventory to load vars for managed_node1 13040 1726882409.84182: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.84190: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.84193: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.84195: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.84325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.84461: done with get_vars() 13040 1726882409.84471: done getting variables 13040 1726882409.84510: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:29 -0400 (0:00:00.041) 0:00:07.322 ****** 13040 1726882409.84532: entering _queue_task() for managed_node1/dnf 13040 1726882409.84721: worker is 1 (out of 1 available) 13040 1726882409.84730: exiting _queue_task() for managed_node1/dnf 13040 1726882409.84741: done queuing things up, now waiting for results queue to drain 13040 1726882409.84742: waiting for pending results... 
13040 1726882409.85007: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13040 1726882409.85138: in run() - task 0e448fcc-3ce9-b123-314b-000000000120 13040 1726882409.85162: variable 'ansible_search_path' from source: unknown 13040 1726882409.85173: variable 'ansible_search_path' from source: unknown 13040 1726882409.85210: calling self._execute() 13040 1726882409.85303: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.85313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.85326: variable 'omit' from source: magic vars 13040 1726882409.85743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.87368: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.87411: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.87445: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.87473: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.87492: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.87555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.87575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882409.87593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.87618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.87630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.87726: variable 'ansible_distribution' from source: facts 13040 1726882409.87730: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.87750: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.87753: when evaluation is False, skipping this task 13040 1726882409.87758: _execute() done 13040 1726882409.87761: dumping result to json 13040 1726882409.87763: done dumping result, returning 13040 1726882409.87777: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000120] 13040 1726882409.87780: sending task result for task 0e448fcc-3ce9-b123-314b-000000000120 13040 1726882409.87871: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000120 13040 1726882409.87875: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.87920: no more pending results, returning what 
we have 13040 1726882409.87923: results queue empty 13040 1726882409.87924: checking for any_errors_fatal 13040 1726882409.87929: done checking for any_errors_fatal 13040 1726882409.87930: checking for max_fail_percentage 13040 1726882409.87931: done checking for max_fail_percentage 13040 1726882409.87932: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.87933: done checking to see if all hosts have failed 13040 1726882409.87933: getting the remaining hosts for this loop 13040 1726882409.87935: done getting the remaining hosts for this loop 13040 1726882409.87938: getting the next task for host managed_node1 13040 1726882409.87944: done getting next task for host managed_node1 13040 1726882409.87948: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13040 1726882409.87950: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.87972: getting variables 13040 1726882409.87973: in VariableManager get_vars() 13040 1726882409.88018: Calling all_inventory to load vars for managed_node1 13040 1726882409.88021: Calling groups_inventory to load vars for managed_node1 13040 1726882409.88023: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.88032: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.88034: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.88037: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.88204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.88326: done with get_vars() 13040 1726882409.88333: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13040 1726882409.88386: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:29 -0400 (0:00:00.038) 0:00:07.361 ****** 13040 1726882409.88407: entering _queue_task() for managed_node1/yum 13040 1726882409.88587: worker is 1 (out of 1 available) 13040 1726882409.88600: exiting _queue_task() for managed_node1/yum 13040 1726882409.88613: done queuing things up, now waiting for results queue to drain 13040 1726882409.88614: waiting for pending results... 
13040 1726882409.88780: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13040 1726882409.88854: in run() - task 0e448fcc-3ce9-b123-314b-000000000121 13040 1726882409.88868: variable 'ansible_search_path' from source: unknown 13040 1726882409.88876: variable 'ansible_search_path' from source: unknown 13040 1726882409.88897: calling self._execute() 13040 1726882409.88962: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.88968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.88980: variable 'omit' from source: magic vars 13040 1726882409.89269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.90880: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.90935: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.90966: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.90992: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.91014: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.91074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.91094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882409.91113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.91143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.91151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.91252: variable 'ansible_distribution' from source: facts 13040 1726882409.91258: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.91274: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.91277: when evaluation is False, skipping this task 13040 1726882409.91279: _execute() done 13040 1726882409.91281: dumping result to json 13040 1726882409.91285: done dumping result, returning 13040 1726882409.91292: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000121] 13040 1726882409.91297: sending task result for task 0e448fcc-3ce9-b123-314b-000000000121 13040 1726882409.91393: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000121 13040 1726882409.91396: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.91440: no more pending results, returning what 
we have 13040 1726882409.91444: results queue empty 13040 1726882409.91445: checking for any_errors_fatal 13040 1726882409.91450: done checking for any_errors_fatal 13040 1726882409.91451: checking for max_fail_percentage 13040 1726882409.91453: done checking for max_fail_percentage 13040 1726882409.91454: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.91454: done checking to see if all hosts have failed 13040 1726882409.91455: getting the remaining hosts for this loop 13040 1726882409.91456: done getting the remaining hosts for this loop 13040 1726882409.91460: getting the next task for host managed_node1 13040 1726882409.91468: done getting next task for host managed_node1 13040 1726882409.91472: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13040 1726882409.91480: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.91499: getting variables 13040 1726882409.91500: in VariableManager get_vars() 13040 1726882409.91546: Calling all_inventory to load vars for managed_node1 13040 1726882409.91548: Calling groups_inventory to load vars for managed_node1 13040 1726882409.91550: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.91558: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.91560: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.91565: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.91683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.91813: done with get_vars() 13040 1726882409.91823: done getting variables 13040 1726882409.91881: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:29 -0400 (0:00:00.035) 0:00:07.396 ****** 13040 1726882409.91913: entering _queue_task() for managed_node1/fail 13040 1726882409.92159: worker is 1 (out of 1 available) 13040 1726882409.92173: exiting _queue_task() for managed_node1/fail 13040 1726882409.92186: done queuing things up, now waiting for results queue to drain 13040 1726882409.92188: waiting for pending results... 
13040 1726882409.92473: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13040 1726882409.92599: in run() - task 0e448fcc-3ce9-b123-314b-000000000122 13040 1726882409.92620: variable 'ansible_search_path' from source: unknown 13040 1726882409.92633: variable 'ansible_search_path' from source: unknown 13040 1726882409.92679: calling self._execute() 13040 1726882409.92781: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.92793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.92808: variable 'omit' from source: magic vars 13040 1726882409.93244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.95548: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.95599: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.95625: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.95650: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.95676: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.95732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.95755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.95773: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.95800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.95815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.95919: variable 'ansible_distribution' from source: facts 13040 1726882409.95924: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.95939: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.95941: when evaluation is False, skipping this task 13040 1726882409.95944: _execute() done 13040 1726882409.95946: dumping result to json 13040 1726882409.95950: done dumping result, returning 13040 1726882409.95959: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000122] 13040 1726882409.95966: sending task result for task 0e448fcc-3ce9-b123-314b-000000000122 13040 1726882409.96055: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000122 13040 1726882409.96057: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.96107: no more pending results, returning what we have 13040 1726882409.96111: results queue empty 13040 1726882409.96111: checking for 
any_errors_fatal 13040 1726882409.96118: done checking for any_errors_fatal 13040 1726882409.96118: checking for max_fail_percentage 13040 1726882409.96120: done checking for max_fail_percentage 13040 1726882409.96121: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.96122: done checking to see if all hosts have failed 13040 1726882409.96122: getting the remaining hosts for this loop 13040 1726882409.96124: done getting the remaining hosts for this loop 13040 1726882409.96127: getting the next task for host managed_node1 13040 1726882409.96133: done getting next task for host managed_node1 13040 1726882409.96137: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13040 1726882409.96140: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.96159: getting variables 13040 1726882409.96161: in VariableManager get_vars() 13040 1726882409.96210: Calling all_inventory to load vars for managed_node1 13040 1726882409.96213: Calling groups_inventory to load vars for managed_node1 13040 1726882409.96215: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.96223: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.96225: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.96228: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.96394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882409.96517: done with get_vars() 13040 1726882409.96524: done getting variables 13040 1726882409.96568: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:29 -0400 (0:00:00.046) 0:00:07.443 ****** 13040 1726882409.96589: entering _queue_task() for managed_node1/package 13040 1726882409.96775: worker is 1 (out of 1 available) 13040 1726882409.96789: exiting _queue_task() for managed_node1/package 13040 1726882409.96802: done queuing things up, now waiting for results queue to drain 13040 1726882409.96804: waiting for pending results... 
13040 1726882409.96979: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 13040 1726882409.97057: in run() - task 0e448fcc-3ce9-b123-314b-000000000123 13040 1726882409.97072: variable 'ansible_search_path' from source: unknown 13040 1726882409.97075: variable 'ansible_search_path' from source: unknown 13040 1726882409.97104: calling self._execute() 13040 1726882409.97170: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882409.97174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882409.97183: variable 'omit' from source: magic vars 13040 1726882409.97490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882409.99090: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882409.99142: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882409.99169: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882409.99199: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882409.99220: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882409.99282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882409.99306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882409.99326: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882409.99354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882409.99365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882409.99463: variable 'ansible_distribution' from source: facts 13040 1726882409.99471: variable 'ansible_distribution_major_version' from source: facts 13040 1726882409.99485: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882409.99488: when evaluation is False, skipping this task 13040 1726882409.99490: _execute() done 13040 1726882409.99494: dumping result to json 13040 1726882409.99496: done dumping result, returning 13040 1726882409.99506: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-b123-314b-000000000123] 13040 1726882409.99509: sending task result for task 0e448fcc-3ce9-b123-314b-000000000123 13040 1726882409.99602: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000123 13040 1726882409.99608: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882409.99683: no more pending results, returning what we have 13040 1726882409.99687: results queue empty 13040 1726882409.99688: checking for any_errors_fatal 13040 1726882409.99694: done checking for any_errors_fatal 
13040 1726882409.99695: checking for max_fail_percentage 13040 1726882409.99696: done checking for max_fail_percentage 13040 1726882409.99697: checking to see if all hosts have failed and the running result is not ok 13040 1726882409.99698: done checking to see if all hosts have failed 13040 1726882409.99699: getting the remaining hosts for this loop 13040 1726882409.99700: done getting the remaining hosts for this loop 13040 1726882409.99703: getting the next task for host managed_node1 13040 1726882409.99714: done getting next task for host managed_node1 13040 1726882409.99721: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13040 1726882409.99729: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882409.99748: getting variables 13040 1726882409.99749: in VariableManager get_vars() 13040 1726882409.99797: Calling all_inventory to load vars for managed_node1 13040 1726882409.99800: Calling groups_inventory to load vars for managed_node1 13040 1726882409.99802: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882409.99811: Calling all_plugins_play to load vars for managed_node1 13040 1726882409.99813: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882409.99815: Calling groups_plugins_play to load vars for managed_node1 13040 1726882409.99928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.00059: done with get_vars() 13040 1726882410.00068: done getting variables 13040 1726882410.00109: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:30 -0400 (0:00:00.035) 0:00:07.478 ****** 13040 1726882410.00131: entering _queue_task() for managed_node1/package 13040 1726882410.00333: worker is 1 (out of 1 available) 13040 1726882410.00347: exiting _queue_task() for managed_node1/package 13040 1726882410.00362: done queuing things up, now waiting for results queue to drain 13040 1726882410.00365: waiting for pending results... 
13040 1726882410.00527: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13040 1726882410.00611: in run() - task 0e448fcc-3ce9-b123-314b-000000000124 13040 1726882410.00622: variable 'ansible_search_path' from source: unknown 13040 1726882410.00626: variable 'ansible_search_path' from source: unknown 13040 1726882410.00657: calling self._execute() 13040 1726882410.00722: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.00725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.00734: variable 'omit' from source: magic vars 13040 1726882410.01090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.03212: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.03262: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.03290: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.03314: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.03335: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.03395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.03415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.03435: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.03462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.03479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.03579: variable 'ansible_distribution' from source: facts 13040 1726882410.03582: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.03596: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.03599: when evaluation is False, skipping this task 13040 1726882410.03601: _execute() done 13040 1726882410.03603: dumping result to json 13040 1726882410.03606: done dumping result, returning 13040 1726882410.03613: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-b123-314b-000000000124] 13040 1726882410.03619: sending task result for task 0e448fcc-3ce9-b123-314b-000000000124 13040 1726882410.03713: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000124 13040 1726882410.03716: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882410.03766: no more pending results, returning what we have 13040 1726882410.03770: results queue empty 13040 1726882410.03771: checking for 
any_errors_fatal 13040 1726882410.03777: done checking for any_errors_fatal 13040 1726882410.03777: checking for max_fail_percentage 13040 1726882410.03779: done checking for max_fail_percentage 13040 1726882410.03780: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.03781: done checking to see if all hosts have failed 13040 1726882410.03781: getting the remaining hosts for this loop 13040 1726882410.03782: done getting the remaining hosts for this loop 13040 1726882410.03786: getting the next task for host managed_node1 13040 1726882410.03792: done getting next task for host managed_node1 13040 1726882410.03796: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13040 1726882410.03799: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882410.03817: getting variables 13040 1726882410.03818: in VariableManager get_vars() 13040 1726882410.03878: Calling all_inventory to load vars for managed_node1 13040 1726882410.03881: Calling groups_inventory to load vars for managed_node1 13040 1726882410.03883: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.03891: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.03893: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.03895: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.04047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.04183: done with get_vars() 13040 1726882410.04193: done getting variables 13040 1726882410.04234: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:30 -0400 (0:00:00.041) 0:00:07.519 ****** 13040 1726882410.04256: entering _queue_task() for managed_node1/package 13040 1726882410.04447: worker is 1 (out of 1 available) 13040 1726882410.04459: exiting _queue_task() for managed_node1/package 13040 1726882410.04473: done queuing things up, now waiting for results queue to drain 13040 1726882410.04474: waiting for pending results... 
13040 1726882410.04709: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13040 1726882410.04772: in run() - task 0e448fcc-3ce9-b123-314b-000000000125 13040 1726882410.04787: variable 'ansible_search_path' from source: unknown 13040 1726882410.04800: variable 'ansible_search_path' from source: unknown 13040 1726882410.04828: calling self._execute() 13040 1726882410.04906: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.04910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.04918: variable 'omit' from source: magic vars 13040 1726882410.05287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.07249: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.07310: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.07336: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.07365: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.07385: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.07442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.07466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.07484: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.07513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.07524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.07619: variable 'ansible_distribution' from source: facts 13040 1726882410.07628: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.07640: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.07643: when evaluation is False, skipping this task 13040 1726882410.07645: _execute() done 13040 1726882410.07648: dumping result to json 13040 1726882410.07651: done dumping result, returning 13040 1726882410.07661: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-b123-314b-000000000125] 13040 1726882410.07668: sending task result for task 0e448fcc-3ce9-b123-314b-000000000125 13040 1726882410.07758: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000125 13040 1726882410.07761: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882410.07805: no more pending results, returning what we have 13040 1726882410.07808: results queue empty 13040 1726882410.07809: checking for any_errors_fatal 13040 
1726882410.07817: done checking for any_errors_fatal 13040 1726882410.07817: checking for max_fail_percentage 13040 1726882410.07819: done checking for max_fail_percentage 13040 1726882410.07820: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.07821: done checking to see if all hosts have failed 13040 1726882410.07821: getting the remaining hosts for this loop 13040 1726882410.07823: done getting the remaining hosts for this loop 13040 1726882410.07826: getting the next task for host managed_node1 13040 1726882410.07833: done getting next task for host managed_node1 13040 1726882410.07836: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13040 1726882410.07839: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882410.07857: getting variables 13040 1726882410.07859: in VariableManager get_vars() 13040 1726882410.07907: Calling all_inventory to load vars for managed_node1 13040 1726882410.07910: Calling groups_inventory to load vars for managed_node1 13040 1726882410.07912: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.07920: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.07922: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.07925: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.08042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.08169: done with get_vars() 13040 1726882410.08179: done getting variables 13040 1726882410.08220: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:30 -0400 (0:00:00.039) 0:00:07.559 ****** 13040 1726882410.08241: entering _queue_task() for managed_node1/service 13040 1726882410.08490: worker is 1 (out of 1 available) 13040 1726882410.08510: exiting _queue_task() for managed_node1/service 13040 1726882410.08522: done queuing things up, now waiting for results queue to drain 13040 1726882410.08523: waiting for pending results... 
13040 1726882410.08797: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13040 1726882410.08936: in run() - task 0e448fcc-3ce9-b123-314b-000000000126 13040 1726882410.08956: variable 'ansible_search_path' from source: unknown 13040 1726882410.08966: variable 'ansible_search_path' from source: unknown 13040 1726882410.09007: calling self._execute() 13040 1726882410.09096: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.09108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.09122: variable 'omit' from source: magic vars 13040 1726882410.09548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.11771: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.11848: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.11891: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.11928: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.11959: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.12038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.12073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.12102: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.12145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.12171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.12306: variable 'ansible_distribution' from source: facts 13040 1726882410.12317: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.12340: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.12347: when evaluation is False, skipping this task 13040 1726882410.12355: _execute() done 13040 1726882410.12360: dumping result to json 13040 1726882410.12369: done dumping result, returning 13040 1726882410.12380: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000126] 13040 1726882410.12390: sending task result for task 0e448fcc-3ce9-b123-314b-000000000126 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882410.12541: no more pending results, returning what we have 13040 1726882410.12546: results queue empty 13040 1726882410.12547: checking for any_errors_fatal 13040 1726882410.12556: done checking for any_errors_fatal 13040 1726882410.12557: checking for max_fail_percentage 13040 1726882410.12559: done checking for 
max_fail_percentage 13040 1726882410.12559: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.12560: done checking to see if all hosts have failed 13040 1726882410.12561: getting the remaining hosts for this loop 13040 1726882410.12562: done getting the remaining hosts for this loop 13040 1726882410.12567: getting the next task for host managed_node1 13040 1726882410.12574: done getting next task for host managed_node1 13040 1726882410.12578: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13040 1726882410.12581: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882410.12604: getting variables 13040 1726882410.12606: in VariableManager get_vars() 13040 1726882410.12659: Calling all_inventory to load vars for managed_node1 13040 1726882410.12662: Calling groups_inventory to load vars for managed_node1 13040 1726882410.12666: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.12673: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000126 13040 1726882410.12677: WORKER PROCESS EXITING 13040 1726882410.12688: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.12691: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.12694: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.12927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.13123: done with get_vars() 13040 1726882410.13134: done getting variables 13040 1726882410.13193: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:30 -0400 (0:00:00.049) 0:00:07.609 ****** 13040 1726882410.13226: entering _queue_task() for managed_node1/service 13040 1726882410.13504: worker is 1 (out of 1 available) 13040 1726882410.13518: exiting _queue_task() for managed_node1/service 13040 1726882410.13530: done queuing things up, now waiting for results queue to drain 13040 1726882410.13532: waiting for pending results... 
13040 1726882410.13828: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13040 1726882410.13971: in run() - task 0e448fcc-3ce9-b123-314b-000000000127 13040 1726882410.14000: variable 'ansible_search_path' from source: unknown 13040 1726882410.14007: variable 'ansible_search_path' from source: unknown 13040 1726882410.14047: calling self._execute() 13040 1726882410.14142: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.14156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.14172: variable 'omit' from source: magic vars 13040 1726882410.14635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.17081: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.17177: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.17222: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.17274: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.17305: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.17398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.17431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.17466: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.17514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.17530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.17665: variable 'ansible_distribution' from source: facts 13040 1726882410.17676: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.17703: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.17709: when evaluation is False, skipping this task 13040 1726882410.17713: _execute() done 13040 1726882410.17718: dumping result to json 13040 1726882410.17723: done dumping result, returning 13040 1726882410.17733: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-b123-314b-000000000127] 13040 1726882410.17741: sending task result for task 0e448fcc-3ce9-b123-314b-000000000127 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13040 1726882410.17880: no more pending results, returning what we have 13040 1726882410.17884: results queue empty 13040 1726882410.17885: checking for any_errors_fatal 13040 1726882410.17892: done checking for any_errors_fatal 13040 1726882410.17893: checking for max_fail_percentage 13040 1726882410.17895: done checking for max_fail_percentage 13040 1726882410.17895: checking to see if all hosts have failed 
and the running result is not ok 13040 1726882410.17896: done checking to see if all hosts have failed 13040 1726882410.17897: getting the remaining hosts for this loop 13040 1726882410.17898: done getting the remaining hosts for this loop 13040 1726882410.17901: getting the next task for host managed_node1 13040 1726882410.17908: done getting next task for host managed_node1 13040 1726882410.17912: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13040 1726882410.17915: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882410.17934: getting variables 13040 1726882410.17936: in VariableManager get_vars() 13040 1726882410.17990: Calling all_inventory to load vars for managed_node1 13040 1726882410.17993: Calling groups_inventory to load vars for managed_node1 13040 1726882410.17996: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.18007: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.18009: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.18012: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.18203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.18432: done with get_vars() 13040 1726882410.18442: done getting variables 13040 1726882410.18570: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000127 13040 1726882410.18574: WORKER PROCESS EXITING 13040 1726882410.18616: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:30 -0400 (0:00:00.054) 0:00:07.663 ****** 13040 1726882410.18659: entering _queue_task() for managed_node1/service 13040 1726882410.19129: worker is 1 (out of 1 available) 13040 1726882410.19142: exiting _queue_task() for managed_node1/service 13040 1726882410.19162: done queuing things up, now waiting for results queue to drain 13040 1726882410.19163: waiting for pending results... 
13040 1726882410.19405: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
13040 1726882410.19488: in run() - task 0e448fcc-3ce9-b123-314b-000000000128
13040 1726882410.19499: variable 'ansible_search_path' from source: unknown
13040 1726882410.19504: variable 'ansible_search_path' from source: unknown
13040 1726882410.19536: calling self._execute()
13040 1726882410.19604: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882410.19608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882410.19617: variable 'omit' from source: magic vars
13040 1726882410.19941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882410.21728: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882410.21775: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882410.21802: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882410.21829: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882410.21848: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882410.21914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882410.21945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882410.21977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882410.22019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882410.22034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882410.22160: variable 'ansible_distribution' from source: facts
13040 1726882410.22173: variable 'ansible_distribution_major_version' from source: facts
13040 1726882410.22192: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882410.22198: when evaluation is False, skipping this task
13040 1726882410.22204: _execute() done
13040 1726882410.22209: dumping result to json
13040 1726882410.22214: done dumping result, returning
13040 1726882410.22223: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-b123-314b-000000000128]
13040 1726882410.22231: sending task result for task 0e448fcc-3ce9-b123-314b-000000000128
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882410.22373: no more pending results, returning what we have
13040 1726882410.22377: results queue empty
13040 1726882410.22378: checking for any_errors_fatal
13040 1726882410.22387: done checking for any_errors_fatal
13040 1726882410.22388: checking for max_fail_percentage
13040 1726882410.22389: done checking for max_fail_percentage
13040 1726882410.22390: checking to see if all hosts have failed and the running result is not ok
13040 1726882410.22391: done checking to see if all hosts have failed
13040 1726882410.22392: getting the remaining hosts for this loop
13040 1726882410.22393: done getting the remaining hosts for this loop
13040 1726882410.22397: getting the next task for host managed_node1
13040 1726882410.22403: done getting next task for host managed_node1
13040 1726882410.22407: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
13040 1726882410.22410: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882410.22428: getting variables
13040 1726882410.22430: in VariableManager get_vars()
13040 1726882410.22483: Calling all_inventory to load vars for managed_node1
13040 1726882410.22486: Calling groups_inventory to load vars for managed_node1
13040 1726882410.22488: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882410.22498: Calling all_plugins_play to load vars for managed_node1
13040 1726882410.22501: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882410.22503: Calling groups_plugins_play to load vars for managed_node1
13040 1726882410.22719: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000128
13040 1726882410.22722: WORKER PROCESS EXITING
13040 1726882410.22728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882410.22914: done with get_vars()
13040 1726882410.22923: done getting variables
13040 1726882410.22975: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 21:33:30 -0400 (0:00:00.043) 0:00:07.707 ******
13040 1726882410.23006: entering _queue_task() for managed_node1/service
13040 1726882410.23211: worker is 1 (out of 1 available)
13040 1726882410.23226: exiting _queue_task() for managed_node1/service
13040 1726882410.23239: done queuing things up, now waiting for results queue to drain
13040 1726882410.23241: waiting for pending results...
13040 1726882410.23403: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service
13040 1726882410.23489: in run() - task 0e448fcc-3ce9-b123-314b-000000000129
13040 1726882410.23500: variable 'ansible_search_path' from source: unknown
13040 1726882410.23503: variable 'ansible_search_path' from source: unknown
13040 1726882410.23530: calling self._execute()
13040 1726882410.23593: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882410.23597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882410.23605: variable 'omit' from source: magic vars
13040 1726882410.23902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882410.25449: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882410.25506: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882410.25532: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882410.25563: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882410.25585: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882410.25644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882410.25671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882410.25689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882410.25716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882410.25726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882410.25825: variable 'ansible_distribution' from source: facts
13040 1726882410.25831: variable 'ansible_distribution_major_version' from source: facts
13040 1726882410.25846: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882410.25848: when evaluation is False, skipping this task
13040 1726882410.25851: _execute() done
13040 1726882410.25853: dumping result to json
13040 1726882410.25859: done dumping result, returning
13040 1726882410.25870: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-b123-314b-000000000129]
13040 1726882410.25881: sending task result for task 0e448fcc-3ce9-b123-314b-000000000129
13040 1726882410.25962: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000129
13040 1726882410.25966: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
13040 1726882410.26015: no more pending results, returning what we have
13040 1726882410.26019: results queue empty
13040 1726882410.26020: checking for any_errors_fatal
13040 1726882410.26027: done checking for any_errors_fatal
13040 1726882410.26028: checking for max_fail_percentage
13040 1726882410.26029: done checking for max_fail_percentage
13040 1726882410.26030: checking to see if all hosts have failed and the running result is not ok
13040 1726882410.26032: done checking to see if all hosts have failed
13040 1726882410.26032: getting the remaining hosts for this loop
13040 1726882410.26034: done getting the remaining hosts for this loop
13040 1726882410.26037: getting the next task for host managed_node1
13040 1726882410.26043: done getting next task for host managed_node1
13040 1726882410.26047: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
13040 1726882410.26050: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882410.26071: getting variables
13040 1726882410.26072: in VariableManager get_vars()
13040 1726882410.26124: Calling all_inventory to load vars for managed_node1
13040 1726882410.26127: Calling groups_inventory to load vars for managed_node1
13040 1726882410.26129: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882410.26137: Calling all_plugins_play to load vars for managed_node1
13040 1726882410.26139: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882410.26141: Calling groups_plugins_play to load vars for managed_node1
13040 1726882410.26259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882410.26385: done with get_vars()
13040 1726882410.26393: done getting variables
13040 1726882410.26434: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 21:33:30 -0400 (0:00:00.034) 0:00:07.741 ******
13040 1726882410.26457: entering _queue_task() for managed_node1/copy
13040 1726882410.26658: worker is 1 (out of 1 available)
13040 1726882410.26674: exiting _queue_task() for managed_node1/copy
13040 1726882410.26687: done queuing things up, now waiting for results queue to drain
13040 1726882410.26688: waiting for pending results...
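The "censored" result above is what Ansible displays when a task sets `no_log: true`. As a rough illustration only (this is a hypothetical sketch, not Ansible's actual implementation), the censoring amounts to replacing the result body while keeping the `changed` flag:

```python
# Hypothetical sketch of `no_log: true` result censoring.
# CENSORED_MSG mirrors the message shown in the log above; the
# censor_result() helper is an illustration, not an Ansible API.
CENSORED_MSG = ("the output has been hidden due to the fact that "
                "'no_log: true' was specified for this result")

def censor_result(result: dict, no_log: bool) -> dict:
    """Replace a task result's fields with a censored marker when no_log is set."""
    if not no_log:
        return result
    # Only the 'changed' flag survives; everything else is hidden.
    return {"censored": CENSORED_MSG, "changed": result.get("changed", False)}

print(censor_result({"changed": False, "secret": "hunter2"}, no_log=True))
```

Running this prints a dict with only the `censored` message and `changed: False`, matching the shape of the skipped-task output in the log.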
13040 1726882410.26860: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
13040 1726882410.26943: in run() - task 0e448fcc-3ce9-b123-314b-00000000012a
13040 1726882410.26954: variable 'ansible_search_path' from source: unknown
13040 1726882410.26960: variable 'ansible_search_path' from source: unknown
13040 1726882410.26991: calling self._execute()
13040 1726882410.27055: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882410.27062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882410.27075: variable 'omit' from source: magic vars
13040 1726882410.27401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882410.29088: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882410.29155: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882410.29196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882410.29232: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882410.29261: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882410.29335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882410.29369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882410.29397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882410.29435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882410.29450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882410.29589: variable 'ansible_distribution' from source: facts
13040 1726882410.29599: variable 'ansible_distribution_major_version' from source: facts
13040 1726882410.29619: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882410.29625: when evaluation is False, skipping this task
13040 1726882410.29630: _execute() done
13040 1726882410.29635: dumping result to json
13040 1726882410.29640: done dumping result, returning
13040 1726882410.29650: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-b123-314b-00000000012a]
13040 1726882410.29660: sending task result for task 0e448fcc-3ce9-b123-314b-00000000012a
13040 1726882410.29765: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000012a
13040 1726882410.29772: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882410.29865: no more pending results, returning what we have
13040 1726882410.29870: results queue empty
13040 1726882410.29870: checking for any_errors_fatal
13040 1726882410.29878: done checking for any_errors_fatal
13040 1726882410.29879: checking for max_fail_percentage
13040 1726882410.29880: done checking for max_fail_percentage
13040 1726882410.29881: checking to see if all hosts have failed and the running result is not ok
13040 1726882410.29882: done checking to see if all hosts have failed
13040 1726882410.29883: getting the remaining hosts for this loop
13040 1726882410.29884: done getting the remaining hosts for this loop
13040 1726882410.29888: getting the next task for host managed_node1
13040 1726882410.29894: done getting next task for host managed_node1
13040 1726882410.29898: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
13040 1726882410.29900: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882410.29919: getting variables
13040 1726882410.29921: in VariableManager get_vars()
13040 1726882410.29977: Calling all_inventory to load vars for managed_node1
13040 1726882410.29980: Calling groups_inventory to load vars for managed_node1
13040 1726882410.29982: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882410.29991: Calling all_plugins_play to load vars for managed_node1
13040 1726882410.29993: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882410.29996: Calling groups_plugins_play to load vars for managed_node1
13040 1726882410.30220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882410.30412: done with get_vars()
13040 1726882410.30422: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 21:33:30 -0400 (0:00:00.040) 0:00:07.782 ******
13040 1726882410.30498: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
13040 1726882410.30734: worker is 1 (out of 1 available)
13040 1726882410.30746: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
13040 1726882410.30760: done queuing things up, now waiting for results queue to drain
13040 1726882410.30761: waiting for pending results...
13040 1726882410.31023: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
13040 1726882410.31153: in run() - task 0e448fcc-3ce9-b123-314b-00000000012b
13040 1726882410.31167: variable 'ansible_search_path' from source: unknown
13040 1726882410.31172: variable 'ansible_search_path' from source: unknown
13040 1726882410.31200: calling self._execute()
13040 1726882410.31268: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882410.31271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882410.31280: variable 'omit' from source: magic vars
13040 1726882410.31587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882410.33162: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882410.33437: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882410.33467: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882410.33493: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882410.33514: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882410.33572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882410.33594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882410.33613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882410.33639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882410.33651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882410.33749: variable 'ansible_distribution' from source: facts
13040 1726882410.33757: variable 'ansible_distribution_major_version' from source: facts
13040 1726882410.33771: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882410.33774: when evaluation is False, skipping this task
13040 1726882410.33777: _execute() done
13040 1726882410.33779: dumping result to json
13040 1726882410.33781: done dumping result, returning
13040 1726882410.33788: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-b123-314b-00000000012b]
13040 1726882410.33794: sending task result for task 0e448fcc-3ce9-b123-314b-00000000012b
13040 1726882410.33890: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000012b
13040 1726882410.33893: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882410.33939: no more pending results, returning what we have
13040 1726882410.33943: results queue empty
13040 1726882410.33944: checking for any_errors_fatal
13040 1726882410.33952: done checking for any_errors_fatal
13040 1726882410.33952: checking for max_fail_percentage
13040 1726882410.33954: done checking for max_fail_percentage
13040 1726882410.33955: checking to see if all hosts have failed and the running result is not ok
13040 1726882410.33956: done checking to see if all hosts have failed
13040 1726882410.33956: getting the remaining hosts for this loop
13040 1726882410.33958: done getting the remaining hosts for this loop
13040 1726882410.33962: getting the next task for host managed_node1
13040 1726882410.33976: done getting next task for host managed_node1
13040 1726882410.33980: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
13040 1726882410.33982: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882410.34000: getting variables
13040 1726882410.34001: in VariableManager get_vars()
13040 1726882410.34050: Calling all_inventory to load vars for managed_node1
13040 1726882410.34053: Calling groups_inventory to load vars for managed_node1
13040 1726882410.34055: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882410.34066: Calling all_plugins_play to load vars for managed_node1
13040 1726882410.34069: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882410.34072: Calling groups_plugins_play to load vars for managed_node1
13040 1726882410.34197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882410.34321: done with get_vars()
13040 1726882410.34329: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 21:33:30 -0400 (0:00:00.038) 0:00:07.821 ******
13040 1726882410.34391: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state
13040 1726882410.34591: worker is 1 (out of 1 available)
13040 1726882410.34605: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state
13040 1726882410.34618: done queuing things up, now waiting for results queue to drain
13040 1726882410.34619: waiting for pending results...
13040 1726882410.34796: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state
13040 1726882410.34881: in run() - task 0e448fcc-3ce9-b123-314b-00000000012c
13040 1726882410.34893: variable 'ansible_search_path' from source: unknown
13040 1726882410.34897: variable 'ansible_search_path' from source: unknown
13040 1726882410.34927: calling self._execute()
13040 1726882410.34991: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882410.34994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882410.35003: variable 'omit' from source: magic vars
13040 1726882410.35309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882410.37121: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882410.37169: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882410.37198: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882410.37221: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882410.37241: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882410.37302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882410.37323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882410.37342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882410.37373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882410.37384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882410.37482: variable 'ansible_distribution' from source: facts
13040 1726882410.37487: variable 'ansible_distribution_major_version' from source: facts
13040 1726882410.37503: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882410.37506: when evaluation is False, skipping this task
13040 1726882410.37508: _execute() done
13040 1726882410.37510: dumping result to json
13040 1726882410.37513: done dumping result, returning
13040 1726882410.37522: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-b123-314b-00000000012c]
13040 1726882410.37526: sending task result for task 0e448fcc-3ce9-b123-314b-00000000012c
13040 1726882410.37617: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000012c
13040 1726882410.37619: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882410.37670: no more pending results, returning what we have
13040 1726882410.37674: results queue empty
13040 1726882410.37674: checking for any_errors_fatal
13040 1726882410.37682: done checking for any_errors_fatal
13040 1726882410.37683: checking for max_fail_percentage
13040 1726882410.37685: done checking for max_fail_percentage
13040 1726882410.37686: checking to see if all hosts have failed and the running result is not ok
13040 1726882410.37686: done checking to see if all hosts have failed
13040 1726882410.37687: getting the remaining hosts for this loop
13040 1726882410.37688: done getting the remaining hosts for this loop
13040 1726882410.37692: getting the next task for host managed_node1
13040 1726882410.37698: done getting next task for host managed_node1
13040 1726882410.37702: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
13040 1726882410.37704: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13040 1726882410.37722: getting variables
13040 1726882410.37724: in VariableManager get_vars()
13040 1726882410.37777: Calling all_inventory to load vars for managed_node1
13040 1726882410.37780: Calling groups_inventory to load vars for managed_node1
13040 1726882410.37782: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882410.37790: Calling all_plugins_play to load vars for managed_node1
13040 1726882410.37793: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882410.37795: Calling groups_plugins_play to load vars for managed_node1
13040 1726882410.37965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882410.38087: done with get_vars()
13040 1726882410.38095: done getting variables
13040 1726882410.38149: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 21:33:30 -0400 (0:00:00.037) 0:00:07.858 ******
13040 1726882410.38176: entering _queue_task() for managed_node1/debug
13040 1726882410.38378: worker is 1 (out of 1 available)
13040 1726882410.38392: exiting _queue_task() for managed_node1/debug
13040 1726882410.38403: done queuing things up, now waiting for results queue to drain
13040 1726882410.38404: waiting for pending results...
13040 1726882410.38581: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13040 1726882410.38659: in run() - task 0e448fcc-3ce9-b123-314b-00000000012d 13040 1726882410.38672: variable 'ansible_search_path' from source: unknown 13040 1726882410.38675: variable 'ansible_search_path' from source: unknown 13040 1726882410.38703: calling self._execute() 13040 1726882410.38767: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.38770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.38779: variable 'omit' from source: magic vars 13040 1726882410.39086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.40895: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.40943: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.40970: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.41006: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.41027: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.41087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.41107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.41125: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.41153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.41168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.41271: variable 'ansible_distribution' from source: facts 13040 1726882410.41276: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.41291: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.41295: when evaluation is False, skipping this task 13040 1726882410.41297: _execute() done 13040 1726882410.41300: dumping result to json 13040 1726882410.41302: done dumping result, returning 13040 1726882410.41309: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-b123-314b-00000000012d] 13040 1726882410.41315: sending task result for task 0e448fcc-3ce9-b123-314b-00000000012d 13040 1726882410.41401: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000012d 13040 1726882410.41404: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882410.41450: no more pending results, returning what we have 13040 1726882410.41453: results queue empty 13040 1726882410.41454: checking for any_errors_fatal 13040 1726882410.41472: done checking for any_errors_fatal 13040 1726882410.41473: checking 
for max_fail_percentage 13040 1726882410.41475: done checking for max_fail_percentage 13040 1726882410.41475: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.41476: done checking to see if all hosts have failed 13040 1726882410.41477: getting the remaining hosts for this loop 13040 1726882410.41478: done getting the remaining hosts for this loop 13040 1726882410.41483: getting the next task for host managed_node1 13040 1726882410.41489: done getting next task for host managed_node1 13040 1726882410.41493: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13040 1726882410.41496: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882410.41518: getting variables 13040 1726882410.41520: in VariableManager get_vars() 13040 1726882410.41570: Calling all_inventory to load vars for managed_node1 13040 1726882410.41573: Calling groups_inventory to load vars for managed_node1 13040 1726882410.41575: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.41584: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.41586: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.41589: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.41716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.41844: done with get_vars() 13040 1726882410.41853: done getting variables 13040 1726882410.41897: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 21:33:30 -0400 (0:00:00.037) 0:00:07.896 ******
13040 1726882410.41920: entering _queue_task() for managed_node1/debug 13040 1726882410.42145: worker is 1 (out of 1 available) 13040 1726882410.42157: exiting _queue_task() for managed_node1/debug 13040 1726882410.42172: done queuing things up, now waiting for results queue to drain 13040 1726882410.42173: waiting for pending results... 
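The debug tasks above are all skipped for the same reason: each carries a `when` guard that only holds on CentOS/RHEL older than 9, and the log reports its text back as `false_condition`. As a hedged sketch only, a task guarded this way looks roughly like the following in playbook form (the task name and the condition are taken verbatim from the log; the debug payload is a hypothetical placeholder, not the role's actual task body):

```yaml
# Sketch of a conditionally-skipped debug task.
# The `when` expression matches the "false_condition" reported in the log;
# the variable being debugged is a placeholder, not the role's real variable.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: network_connections_stderr   # hypothetical placeholder name
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9
```

On managed_node1 in this run the facts do not satisfy the expression, so the executor logs "when evaluation is False, skipping this task" and returns a `skipping:` result instead of running the action.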
13040 1726882410.42366: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13040 1726882410.42453: in run() - task 0e448fcc-3ce9-b123-314b-00000000012e 13040 1726882410.42470: variable 'ansible_search_path' from source: unknown 13040 1726882410.42474: variable 'ansible_search_path' from source: unknown 13040 1726882410.42503: calling self._execute() 13040 1726882410.42570: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.42573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.42582: variable 'omit' from source: magic vars 13040 1726882410.42898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.45311: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.45401: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.45447: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.45489: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.45526: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.45613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.45647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.45683: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.45736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.45759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.45922: variable 'ansible_distribution' from source: facts 13040 1726882410.45945: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.45973: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.45981: when evaluation is False, skipping this task 13040 1726882410.45987: _execute() done 13040 1726882410.45993: dumping result to json 13040 1726882410.45999: done dumping result, returning 13040 1726882410.46010: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-b123-314b-00000000012e] 13040 1726882410.46019: sending task result for task 0e448fcc-3ce9-b123-314b-00000000012e 13040 1726882410.46131: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000012e 13040 1726882410.46137: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882410.46200: no more pending results, returning what we have 13040 1726882410.46204: results queue empty 13040 1726882410.46205: checking for any_errors_fatal 13040 1726882410.46213: done checking for any_errors_fatal 13040 1726882410.46214: checking 
for max_fail_percentage 13040 1726882410.46216: done checking for max_fail_percentage 13040 1726882410.46217: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.46217: done checking to see if all hosts have failed 13040 1726882410.46218: getting the remaining hosts for this loop 13040 1726882410.46220: done getting the remaining hosts for this loop 13040 1726882410.46224: getting the next task for host managed_node1 13040 1726882410.46231: done getting next task for host managed_node1 13040 1726882410.46236: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13040 1726882410.46239: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882410.46267: getting variables 13040 1726882410.46270: in VariableManager get_vars() 13040 1726882410.46326: Calling all_inventory to load vars for managed_node1 13040 1726882410.46329: Calling groups_inventory to load vars for managed_node1 13040 1726882410.46332: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.46343: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.46345: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.46348: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.46587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.46852: done with get_vars() 13040 1726882410.46896: done getting variables 13040 1726882410.46954: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:33:30 -0400 (0:00:00.050) 0:00:07.947 ******
13040 1726882410.46990: entering _queue_task() for managed_node1/debug 13040 1726882410.47271: worker is 1 (out of 1 available) 13040 1726882410.47283: exiting _queue_task() for managed_node1/debug 13040 1726882410.47295: done queuing things up, now waiting for results queue to drain 13040 1726882410.47296: waiting for pending results... 
13040 1726882410.47634: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13040 1726882410.47802: in run() - task 0e448fcc-3ce9-b123-314b-00000000012f 13040 1726882410.47831: variable 'ansible_search_path' from source: unknown 13040 1726882410.47850: variable 'ansible_search_path' from source: unknown 13040 1726882410.47913: calling self._execute() 13040 1726882410.48007: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.48017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.48029: variable 'omit' from source: magic vars 13040 1726882410.48565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.51021: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.51073: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.51113: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.51141: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.51162: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.51219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.51242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.51261: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.51289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.51300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.51401: variable 'ansible_distribution' from source: facts 13040 1726882410.51406: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.51422: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.51425: when evaluation is False, skipping this task 13040 1726882410.51427: _execute() done 13040 1726882410.51429: dumping result to json 13040 1726882410.51431: done dumping result, returning 13040 1726882410.51439: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-b123-314b-00000000012f] 13040 1726882410.51445: sending task result for task 0e448fcc-3ce9-b123-314b-00000000012f 13040 1726882410.51537: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000012f 13040 1726882410.51539: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882410.51619: no more pending results, returning what we have 13040 1726882410.51622: results queue empty 13040 1726882410.51623: checking for any_errors_fatal 13040 1726882410.51629: done checking for any_errors_fatal 13040 1726882410.51630: checking for 
max_fail_percentage 13040 1726882410.51632: done checking for max_fail_percentage 13040 1726882410.51633: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.51633: done checking to see if all hosts have failed 13040 1726882410.51634: getting the remaining hosts for this loop 13040 1726882410.51635: done getting the remaining hosts for this loop 13040 1726882410.51639: getting the next task for host managed_node1 13040 1726882410.51645: done getting next task for host managed_node1 13040 1726882410.51649: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13040 1726882410.51652: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882410.51673: getting variables 13040 1726882410.51675: in VariableManager get_vars() 13040 1726882410.51721: Calling all_inventory to load vars for managed_node1 13040 1726882410.51724: Calling groups_inventory to load vars for managed_node1 13040 1726882410.51726: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.51734: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.51736: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.51739: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.51855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.51984: done with get_vars() 13040 1726882410.51992: done getting variables
TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:33:30 -0400 (0:00:00.050) 0:00:07.998 ******
13040 1726882410.52080: entering _queue_task() for managed_node1/ping 13040 1726882410.52356: worker is 1 (out of 1 available) 13040 1726882410.52372: exiting _queue_task() for managed_node1/ping 13040 1726882410.52387: done queuing things up, now waiting for results queue to drain 13040 1726882410.52389: waiting for pending results... 
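Note that this "Re-test connectivity" task is queued for the `ping` action rather than `debug`: it round-trips a module to the managed host to confirm the node is still reachable after the role has rewritten network configuration. As a hedged sketch under the same assumptions as above (the task name and the guard come from the log; this is not necessarily the role's exact task body):

```yaml
# Sketch of a connectivity re-test; ansible.builtin.ping succeeds only if
# the control node can still connect and execute a module on the host.
- name: Re-test connectivity
  ansible.builtin.ping:
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9
```

In this run the guard is False, so the ping is skipped and the log moves straight on to `meta (role_complete)`, ending the role's task list.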
13040 1726882410.52682: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 13040 1726882410.52814: in run() - task 0e448fcc-3ce9-b123-314b-000000000130 13040 1726882410.52839: variable 'ansible_search_path' from source: unknown 13040 1726882410.52846: variable 'ansible_search_path' from source: unknown 13040 1726882410.52889: calling self._execute() 13040 1726882410.52980: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.52991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.53006: variable 'omit' from source: magic vars 13040 1726882410.53451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.55306: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.55354: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.55380: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.55406: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.55426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.55498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.55518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.55537: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.55589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.55608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.55743: variable 'ansible_distribution' from source: facts 13040 1726882410.55754: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.55780: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.55787: when evaluation is False, skipping this task 13040 1726882410.55793: _execute() done 13040 1726882410.55799: dumping result to json 13040 1726882410.55805: done dumping result, returning 13040 1726882410.55815: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-b123-314b-000000000130] 13040 1726882410.55824: sending task result for task 0e448fcc-3ce9-b123-314b-000000000130 13040 1726882410.55927: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000130 13040 1726882410.55934: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882410.56011: no more pending results, returning what we have 13040 1726882410.56015: results queue empty 13040 1726882410.56016: checking for any_errors_fatal 13040 1726882410.56022: done checking for 
any_errors_fatal 13040 1726882410.56023: checking for max_fail_percentage 13040 1726882410.56025: done checking for max_fail_percentage 13040 1726882410.56025: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.56026: done checking to see if all hosts have failed 13040 1726882410.56027: getting the remaining hosts for this loop 13040 1726882410.56028: done getting the remaining hosts for this loop 13040 1726882410.56032: getting the next task for host managed_node1 13040 1726882410.56039: done getting next task for host managed_node1 13040 1726882410.56042: ^ task is: TASK: meta (role_complete) 13040 1726882410.56044: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882410.56067: getting variables 13040 1726882410.56069: in VariableManager get_vars() 13040 1726882410.56122: Calling all_inventory to load vars for managed_node1 13040 1726882410.56125: Calling groups_inventory to load vars for managed_node1 13040 1726882410.56127: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.56137: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.56139: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.56142: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.56687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.56882: done with get_vars() 13040 1726882410.56891: done getting variables 13040 1726882410.56968: done queuing things up, now waiting for results queue to drain 13040 1726882410.56971: results queue empty 13040 1726882410.56971: checking for any_errors_fatal 13040 1726882410.56973: done checking for any_errors_fatal 13040 1726882410.56974: checking for max_fail_percentage 13040 1726882410.56975: done checking for max_fail_percentage 13040 1726882410.56976: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.56976: done checking to see if all hosts have failed 13040 1726882410.56977: getting the remaining hosts for this loop 13040 1726882410.56978: done getting the remaining hosts for this loop 13040 1726882410.56980: getting the next task for host managed_node1 13040 1726882410.56984: done getting next task for host managed_node1 13040 1726882410.56986: ^ task is: TASK: From the active connection, get the controller profile "{{ controller_profile }}" 13040 1726882410.56987: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13040 1726882410.56989: getting variables 13040 1726882410.56990: in VariableManager get_vars() 13040 1726882410.57007: Calling all_inventory to load vars for managed_node1 13040 1726882410.57009: Calling groups_inventory to load vars for managed_node1 13040 1726882410.57011: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.57015: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.57017: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.57020: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.57310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.57710: done with get_vars() 13040 1726882410.57718: done getting variables 13040 1726882410.57757: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13040 1726882410.57989: variable 'controller_profile' from source: play vars
TASK [From the active connection, get the controller profile "bond0"] **********
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:200
Friday 20 September 2024 21:33:30 -0400 (0:00:00.059) 0:00:08.057 ******
13040 1726882410.58058: entering _queue_task() for managed_node1/command 13040 1726882410.58385: worker is 1 (out of 1 available) 13040 1726882410.58399: exiting _queue_task() for managed_node1/command 13040 1726882410.58411: done queuing things up, now waiting for results queue to drain 13040 1726882410.58413: waiting for pending results... 
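This banner also shows task-name templating at work: the playbook source names the task `From the active connection, get the controller profile "{{ controller_profile }}"`, and because `controller_profile` is a play var (logged as "variable 'controller_profile' from source: play vars"), the banner prints with `bond0` substituted. A hedged sketch of what such a task might look like (the name and variable come from the log; the command body is a placeholder, since the task was skipped and its real command never appeared in the output):

```yaml
# Sketch only: the real task at tests_bond_removal.yml:200 was skipped in
# this run, so its actual command is unknown; cmd below is a placeholder.
# controller_profile is a play var in the log; set inline here only to
# keep the sketch self-contained.
- name: From the active connection, get the controller profile "{{ controller_profile }}"
  ansible.builtin.command:
    cmd: echo "{{ controller_profile }}"   # hypothetical placeholder command
  vars:
    controller_profile: bond0
```

Templating of the name happens before the banner is printed, which is why the log shows the resolved value rather than the raw `{{ controller_profile }}` expression.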
13040 1726882410.58819: running TaskExecutor() for managed_node1/TASK: From the active connection, get the controller profile "bond0" 13040 1726882410.58943: in run() - task 0e448fcc-3ce9-b123-314b-000000000160 13040 1726882410.58956: variable 'ansible_search_path' from source: unknown 13040 1726882410.59001: calling self._execute() 13040 1726882410.59083: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.59091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.59100: variable 'omit' from source: magic vars 13040 1726882410.59428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.61208: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.61281: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.61324: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.61361: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.61396: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.61479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.61515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.61544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.61593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.61613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.61756: variable 'ansible_distribution' from source: facts 13040 1726882410.61772: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.61794: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.61802: when evaluation is False, skipping this task 13040 1726882410.61808: _execute() done 13040 1726882410.61813: dumping result to json 13040 1726882410.61820: done dumping result, returning 13040 1726882410.61829: done running TaskExecutor() for managed_node1/TASK: From the active connection, get the controller profile "bond0" [0e448fcc-3ce9-b123-314b-000000000160] 13040 1726882410.61838: sending task result for task 0e448fcc-3ce9-b123-314b-000000000160 13040 1726882410.61946: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000160 13040 1726882410.61952: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882410.62095: no more pending results, returning what we have 13040 1726882410.62098: results queue empty 13040 1726882410.62099: checking for any_errors_fatal 13040 1726882410.62101: done checking for any_errors_fatal 13040 1726882410.62102: checking for max_fail_percentage 
13040 1726882410.62103: done checking for max_fail_percentage 13040 1726882410.62104: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.62105: done checking to see if all hosts have failed 13040 1726882410.62106: getting the remaining hosts for this loop 13040 1726882410.62107: done getting the remaining hosts for this loop 13040 1726882410.62110: getting the next task for host managed_node1 13040 1726882410.62116: done getting next task for host managed_node1 13040 1726882410.62118: ^ task is: TASK: Assert that the controller profile is activated 13040 1726882410.62120: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882410.62124: getting variables 13040 1726882410.62126: in VariableManager get_vars() 13040 1726882410.62185: Calling all_inventory to load vars for managed_node1 13040 1726882410.62188: Calling groups_inventory to load vars for managed_node1 13040 1726882410.62190: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.62199: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.62201: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.62204: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.62344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.62574: done with get_vars() 13040 1726882410.62584: done getting variables 13040 1726882410.62637: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:207 Friday 20 September 2024 21:33:30 -0400 (0:00:00.046) 0:00:08.104 ****** 13040 1726882410.62689: entering _queue_task() for managed_node1/assert 13040 1726882410.62949: worker is 1 (out of 1 available) 13040 1726882410.62966: exiting _queue_task() for managed_node1/assert 13040 1726882410.62979: done queuing things up, now waiting for results queue to drain 13040 1726882410.62980: waiting for pending results... 13040 1726882410.63261: running TaskExecutor() for managed_node1/TASK: Assert that the controller profile is activated 13040 1726882410.63391: in run() - task 0e448fcc-3ce9-b123-314b-000000000161 13040 1726882410.63425: variable 'ansible_search_path' from source: unknown 13040 1726882410.63514: calling self._execute() 13040 1726882410.63656: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.63679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.63687: variable 'omit' from source: magic vars 13040 1726882410.64033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.65748: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.65831: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.65878: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.65920: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.65955: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.66047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.66083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.66118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.66162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.66184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.66334: variable 'ansible_distribution' from source: facts 13040 1726882410.66345: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.66371: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.66380: when evaluation is False, skipping this task 13040 1726882410.66387: _execute() done 13040 1726882410.66394: dumping result to json 13040 1726882410.66401: done dumping result, returning 13040 1726882410.66412: done running TaskExecutor() for managed_node1/TASK: Assert that the controller profile is activated 
[0e448fcc-3ce9-b123-314b-000000000161] 13040 1726882410.66429: sending task result for task 0e448fcc-3ce9-b123-314b-000000000161 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882410.66585: no more pending results, returning what we have 13040 1726882410.66589: results queue empty 13040 1726882410.66590: checking for any_errors_fatal 13040 1726882410.66596: done checking for any_errors_fatal 13040 1726882410.66597: checking for max_fail_percentage 13040 1726882410.66599: done checking for max_fail_percentage 13040 1726882410.66600: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.66601: done checking to see if all hosts have failed 13040 1726882410.66602: getting the remaining hosts for this loop 13040 1726882410.66603: done getting the remaining hosts for this loop 13040 1726882410.66607: getting the next task for host managed_node1 13040 1726882410.66613: done getting next task for host managed_node1 13040 1726882410.66616: ^ task is: TASK: Get the controller device details 13040 1726882410.66618: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882410.66622: getting variables 13040 1726882410.66624: in VariableManager get_vars() 13040 1726882410.66688: Calling all_inventory to load vars for managed_node1 13040 1726882410.66691: Calling groups_inventory to load vars for managed_node1 13040 1726882410.66693: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.66705: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.66708: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.66711: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.66869: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000161 13040 1726882410.66876: WORKER PROCESS EXITING 13040 1726882410.66919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.67122: done with get_vars() 13040 1726882410.67134: done getting variables 13040 1726882410.67185: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the controller device details] *************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:214 Friday 20 September 2024 21:33:30 -0400 (0:00:00.045) 0:00:08.149 ****** 13040 1726882410.67207: entering _queue_task() for managed_node1/command 13040 1726882410.67393: worker is 1 (out of 1 available) 13040 1726882410.67406: exiting _queue_task() for managed_node1/command 13040 1726882410.67417: done queuing things up, now waiting for results queue to drain 13040 1726882410.67418: waiting for pending results... 
13040 1726882410.67592: running TaskExecutor() for managed_node1/TASK: Get the controller device details 13040 1726882410.67654: in run() - task 0e448fcc-3ce9-b123-314b-000000000162 13040 1726882410.67664: variable 'ansible_search_path' from source: unknown 13040 1726882410.67693: calling self._execute() 13040 1726882410.67762: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.67767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.67776: variable 'omit' from source: magic vars 13040 1726882410.68088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.69723: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.69769: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.69796: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.69824: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.69843: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.69903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.69926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.69943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 13040 1726882410.69972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.69982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.70083: variable 'ansible_distribution' from source: facts 13040 1726882410.70089: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.70104: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.70106: when evaluation is False, skipping this task 13040 1726882410.70109: _execute() done 13040 1726882410.70111: dumping result to json 13040 1726882410.70115: done dumping result, returning 13040 1726882410.70121: done running TaskExecutor() for managed_node1/TASK: Get the controller device details [0e448fcc-3ce9-b123-314b-000000000162] 13040 1726882410.70127: sending task result for task 0e448fcc-3ce9-b123-314b-000000000162 13040 1726882410.70216: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000162 13040 1726882410.70219: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882410.70291: no more pending results, returning what we have 13040 1726882410.70295: results queue empty 13040 1726882410.70296: checking for any_errors_fatal 13040 1726882410.70303: done checking for any_errors_fatal 13040 1726882410.70303: checking for max_fail_percentage 13040 1726882410.70305: done checking for max_fail_percentage 13040 1726882410.70306: checking to see if 
all hosts have failed and the running result is not ok 13040 1726882410.70307: done checking to see if all hosts have failed 13040 1726882410.70308: getting the remaining hosts for this loop 13040 1726882410.70309: done getting the remaining hosts for this loop 13040 1726882410.70312: getting the next task for host managed_node1 13040 1726882410.70318: done getting next task for host managed_node1 13040 1726882410.70320: ^ task is: TASK: Assert that the controller profile is activated 13040 1726882410.70322: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882410.70326: getting variables 13040 1726882410.70327: in VariableManager get_vars() 13040 1726882410.70383: Calling all_inventory to load vars for managed_node1 13040 1726882410.70386: Calling groups_inventory to load vars for managed_node1 13040 1726882410.70388: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.70396: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.70399: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.70401: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.70572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.70692: done with get_vars() 13040 1726882410.70700: done getting variables 13040 1726882410.70740: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is 
activated] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:221 Friday 20 September 2024 21:33:30 -0400 (0:00:00.035) 0:00:08.184 ****** 13040 1726882410.70760: entering _queue_task() for managed_node1/assert 13040 1726882410.70951: worker is 1 (out of 1 available) 13040 1726882410.70966: exiting _queue_task() for managed_node1/assert 13040 1726882410.70980: done queuing things up, now waiting for results queue to drain 13040 1726882410.70982: waiting for pending results... 13040 1726882410.71176: running TaskExecutor() for managed_node1/TASK: Assert that the controller profile is activated 13040 1726882410.71236: in run() - task 0e448fcc-3ce9-b123-314b-000000000163 13040 1726882410.71247: variable 'ansible_search_path' from source: unknown 13040 1726882410.71278: calling self._execute() 13040 1726882410.71347: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.71350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.71360: variable 'omit' from source: magic vars 13040 1726882410.71699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.73466: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.73553: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.73598: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.73637: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.73673: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.73758: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.73801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.73832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.73884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.73905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.74046: variable 'ansible_distribution' from source: facts 13040 1726882410.74058: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.74084: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.74091: when evaluation is False, skipping this task 13040 1726882410.74098: _execute() done 13040 1726882410.74104: dumping result to json 13040 1726882410.74113: done dumping result, returning 13040 1726882410.74124: done running TaskExecutor() for managed_node1/TASK: Assert that the controller profile is activated [0e448fcc-3ce9-b123-314b-000000000163] 13040 1726882410.74134: sending task result for task 0e448fcc-3ce9-b123-314b-000000000163 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882410.74292: no more pending results, returning what we have 13040 1726882410.74296: results queue empty 13040 1726882410.74297: checking for any_errors_fatal 13040 1726882410.74302: done checking for any_errors_fatal 13040 1726882410.74303: checking for max_fail_percentage 13040 1726882410.74305: done checking for max_fail_percentage 13040 1726882410.74305: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.74306: done checking to see if all hosts have failed 13040 1726882410.74307: getting the remaining hosts for this loop 13040 1726882410.74308: done getting the remaining hosts for this loop 13040 1726882410.74312: getting the next task for host managed_node1 13040 1726882410.74323: done getting next task for host managed_node1 13040 1726882410.74329: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13040 1726882410.74333: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882410.74353: getting variables 13040 1726882410.74356: in VariableManager get_vars() 13040 1726882410.74409: Calling all_inventory to load vars for managed_node1 13040 1726882410.74412: Calling groups_inventory to load vars for managed_node1 13040 1726882410.74414: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.74424: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.74427: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.74429: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.74588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.74733: done with get_vars() 13040 1726882410.74742: done getting variables 13040 1726882410.74780: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000163 13040 1726882410.74783: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:30 -0400 (0:00:00.040) 0:00:08.225 ****** 13040 1726882410.74828: entering _queue_task() for managed_node1/include_tasks 13040 1726882410.75017: worker is 1 (out of 1 available) 13040 1726882410.75031: exiting _queue_task() for managed_node1/include_tasks 13040 1726882410.75043: done queuing things up, now waiting for results queue to drain 13040 1726882410.75045: waiting for pending results... 
13040 1726882410.75224: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13040 1726882410.75326: in run() - task 0e448fcc-3ce9-b123-314b-00000000016c 13040 1726882410.75339: variable 'ansible_search_path' from source: unknown 13040 1726882410.75343: variable 'ansible_search_path' from source: unknown 13040 1726882410.75375: calling self._execute() 13040 1726882410.75441: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.75445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.75453: variable 'omit' from source: magic vars 13040 1726882410.75758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.77407: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.77453: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.77485: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.77511: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.77531: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.77592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.77615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.77632: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.77661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.77674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.77774: variable 'ansible_distribution' from source: facts 13040 1726882410.77777: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.77792: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.77795: when evaluation is False, skipping this task 13040 1726882410.77798: _execute() done 13040 1726882410.77800: dumping result to json 13040 1726882410.77803: done dumping result, returning 13040 1726882410.77810: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-b123-314b-00000000016c] 13040 1726882410.77815: sending task result for task 0e448fcc-3ce9-b123-314b-00000000016c 13040 1726882410.77903: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000016c 13040 1726882410.77906: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882410.77953: no more pending results, returning what we have 13040 1726882410.77957: results queue empty 13040 1726882410.77958: checking for any_errors_fatal 13040 1726882410.77966: done checking for 
any_errors_fatal 13040 1726882410.77967: checking for max_fail_percentage 13040 1726882410.77969: done checking for max_fail_percentage 13040 1726882410.77970: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.77971: done checking to see if all hosts have failed 13040 1726882410.77971: getting the remaining hosts for this loop 13040 1726882410.77973: done getting the remaining hosts for this loop 13040 1726882410.77977: getting the next task for host managed_node1 13040 1726882410.77985: done getting next task for host managed_node1 13040 1726882410.77989: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13040 1726882410.77993: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882410.78012: getting variables 13040 1726882410.78013: in VariableManager get_vars() 13040 1726882410.78067: Calling all_inventory to load vars for managed_node1 13040 1726882410.78070: Calling groups_inventory to load vars for managed_node1 13040 1726882410.78072: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.78080: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.78082: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.78085: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.78246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.78375: done with get_vars() 13040 1726882410.78384: done getting variables 13040 1726882410.78423: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:30 -0400 (0:00:00.036) 0:00:08.261 ****** 13040 1726882410.78448: entering _queue_task() for managed_node1/debug 13040 1726882410.78648: worker is 1 (out of 1 available) 13040 1726882410.78661: exiting _queue_task() for managed_node1/debug 13040 1726882410.78676: done queuing things up, now waiting for results queue to drain 13040 1726882410.78677: waiting for pending results... 
13040 1726882410.78849: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 13040 1726882410.78959: in run() - task 0e448fcc-3ce9-b123-314b-00000000016d 13040 1726882410.78972: variable 'ansible_search_path' from source: unknown 13040 1726882410.78976: variable 'ansible_search_path' from source: unknown 13040 1726882410.79006: calling self._execute() 13040 1726882410.79077: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.79081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.79088: variable 'omit' from source: magic vars 13040 1726882410.79396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.81015: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.81071: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.81106: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.81131: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.81150: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.81213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.81234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882410.81251: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.81282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.81292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.81391: variable 'ansible_distribution' from source: facts 13040 1726882410.81395: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.81411: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.81414: when evaluation is False, skipping this task 13040 1726882410.81417: _execute() done 13040 1726882410.81420: dumping result to json 13040 1726882410.81422: done dumping result, returning 13040 1726882410.81434: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-b123-314b-00000000016d] 13040 1726882410.81727: sending task result for task 0e448fcc-3ce9-b123-314b-00000000016d 13040 1726882410.81800: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000016d 13040 1726882410.81803: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882410.81862: no more pending results, returning what we have 13040 1726882410.81867: results queue empty 13040 1726882410.81868: checking for any_errors_fatal 13040 1726882410.81874: done checking for any_errors_fatal 13040 1726882410.81875: checking for max_fail_percentage 
13040 1726882410.81877: done checking for max_fail_percentage 13040 1726882410.81878: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.81879: done checking to see if all hosts have failed 13040 1726882410.81879: getting the remaining hosts for this loop 13040 1726882410.81881: done getting the remaining hosts for this loop 13040 1726882410.81884: getting the next task for host managed_node1 13040 1726882410.81891: done getting next task for host managed_node1 13040 1726882410.81895: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13040 1726882410.81899: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882410.81918: getting variables 13040 1726882410.81920: in VariableManager get_vars() 13040 1726882410.81976: Calling all_inventory to load vars for managed_node1 13040 1726882410.81980: Calling groups_inventory to load vars for managed_node1 13040 1726882410.81982: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.81991: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.81994: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.81996: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.82177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.82500: done with get_vars() 13040 1726882410.82509: done getting variables 13040 1726882410.82553: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 21:33:30 -0400 (0:00:00.041) 0:00:08.303 ******
13040 1726882410.82598: entering _queue_task() for managed_node1/fail 13040 1726882410.82813: worker is 1 (out of 1 available) 13040 1726882410.82825: exiting _queue_task() for managed_node1/fail 13040 1726882410.82837: done queuing things up, now waiting for results queue to drain 13040 1726882410.82839: waiting for pending results...
13040 1726882410.83015: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13040 1726882410.83113: in run() - task 0e448fcc-3ce9-b123-314b-00000000016e 13040 1726882410.83123: variable 'ansible_search_path' from source: unknown 13040 1726882410.83127: variable 'ansible_search_path' from source: unknown 13040 1726882410.83159: calling self._execute() 13040 1726882410.83228: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.83232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.83239: variable 'omit' from source: magic vars 13040 1726882410.83550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.85961: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.86034: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.86080: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.86117: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.86146: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.86228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.86280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882410.86324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.86413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.86462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.86619: variable 'ansible_distribution' from source: facts 13040 1726882410.86623: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.86638: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.86641: when evaluation is False, skipping this task 13040 1726882410.86643: _execute() done 13040 1726882410.86645: dumping result to json 13040 1726882410.86649: done dumping result, returning 13040 1726882410.86679: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-b123-314b-00000000016e] 13040 1726882410.86683: sending task result for task 0e448fcc-3ce9-b123-314b-00000000016e 13040 1726882410.86777: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000016e 13040 1726882410.86779: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882410.86822: no more pending results, returning what we
have 13040 1726882410.86825: results queue empty 13040 1726882410.86826: checking for any_errors_fatal 13040 1726882410.86832: done checking for any_errors_fatal 13040 1726882410.86833: checking for max_fail_percentage 13040 1726882410.86834: done checking for max_fail_percentage 13040 1726882410.86835: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.86836: done checking to see if all hosts have failed 13040 1726882410.86836: getting the remaining hosts for this loop 13040 1726882410.86838: done getting the remaining hosts for this loop 13040 1726882410.86841: getting the next task for host managed_node1 13040 1726882410.86848: done getting next task for host managed_node1 13040 1726882410.86855: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13040 1726882410.86860: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882410.86881: getting variables 13040 1726882410.86883: in VariableManager get_vars() 13040 1726882410.86933: Calling all_inventory to load vars for managed_node1 13040 1726882410.86936: Calling groups_inventory to load vars for managed_node1 13040 1726882410.86939: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.86948: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.86950: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.86956: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.87130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.87258: done with get_vars() 13040 1726882410.87268: done getting variables 13040 1726882410.87308: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 21:33:30 -0400 (0:00:00.047) 0:00:08.350 ******
13040 1726882410.87332: entering _queue_task() for managed_node1/fail 13040 1726882410.87545: worker is 1 (out of 1 available) 13040 1726882410.87560: exiting _queue_task() for managed_node1/fail 13040 1726882410.87574: done queuing things up, now waiting for results queue to drain 13040 1726882410.87576: waiting for pending results...
13040 1726882410.87749: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13040 1726882410.87842: in run() - task 0e448fcc-3ce9-b123-314b-00000000016f 13040 1726882410.87856: variable 'ansible_search_path' from source: unknown 13040 1726882410.87859: variable 'ansible_search_path' from source: unknown 13040 1726882410.87889: calling self._execute() 13040 1726882410.87958: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.87962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.87971: variable 'omit' from source: magic vars 13040 1726882410.88293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.90431: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.90488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.90516: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.90543: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.90564: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.90620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.90642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 
1726882410.90669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.90695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.90706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.90804: variable 'ansible_distribution' from source: facts 13040 1726882410.90808: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.90823: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.90826: when evaluation is False, skipping this task 13040 1726882410.90829: _execute() done 13040 1726882410.90831: dumping result to json 13040 1726882410.90834: done dumping result, returning 13040 1726882410.90843: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-b123-314b-00000000016f] 13040 1726882410.90848: sending task result for task 0e448fcc-3ce9-b123-314b-00000000016f 13040 1726882410.90936: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000016f 13040 1726882410.90939: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882410.90999: no more pending results, returning what we have 13040 1726882410.91003: results
queue empty 13040 1726882410.91004: checking for any_errors_fatal 13040 1726882410.91009: done checking for any_errors_fatal 13040 1726882410.91009: checking for max_fail_percentage 13040 1726882410.91011: done checking for max_fail_percentage 13040 1726882410.91012: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.91013: done checking to see if all hosts have failed 13040 1726882410.91013: getting the remaining hosts for this loop 13040 1726882410.91015: done getting the remaining hosts for this loop 13040 1726882410.91018: getting the next task for host managed_node1 13040 1726882410.91025: done getting next task for host managed_node1 13040 1726882410.91029: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13040 1726882410.91033: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882410.91051: getting variables 13040 1726882410.91055: in VariableManager get_vars() 13040 1726882410.91104: Calling all_inventory to load vars for managed_node1 13040 1726882410.91107: Calling groups_inventory to load vars for managed_node1 13040 1726882410.91109: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.91117: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.91120: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.91122: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.91240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.91372: done with get_vars() 13040 1726882410.91381: done getting variables 13040 1726882410.91426: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:33:30 -0400 (0:00:00.041) 0:00:08.391 ******
13040 1726882410.91468: entering _queue_task() for managed_node1/fail 13040 1726882410.91705: worker is 1 (out of 1 available) 13040 1726882410.91717: exiting _queue_task() for managed_node1/fail 13040 1726882410.91727: done queuing things up, now waiting for results queue to drain 13040 1726882410.91728: waiting for pending results...
13040 1726882410.92012: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13040 1726882410.92167: in run() - task 0e448fcc-3ce9-b123-314b-000000000170 13040 1726882410.92188: variable 'ansible_search_path' from source: unknown 13040 1726882410.92197: variable 'ansible_search_path' from source: unknown 13040 1726882410.92235: calling self._execute() 13040 1726882410.92328: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.92340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.92353: variable 'omit' from source: magic vars 13040 1726882410.92796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882410.94667: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882410.94712: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882410.94740: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882410.94768: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882410.94790: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882410.94843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882410.94867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 
1726882410.94886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882410.94913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882410.94924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882410.95018: variable 'ansible_distribution' from source: facts 13040 1726882410.95022: variable 'ansible_distribution_major_version' from source: facts 13040 1726882410.95037: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882410.95040: when evaluation is False, skipping this task 13040 1726882410.95042: _execute() done 13040 1726882410.95044: dumping result to json 13040 1726882410.95047: done dumping result, returning 13040 1726882410.95056: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-b123-314b-000000000170] 13040 1726882410.95059: sending task result for task 0e448fcc-3ce9-b123-314b-000000000170 13040 1726882410.95149: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000170 13040 1726882410.95154: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882410.95280: no more pending results, returning what we have 13040 1726882410.95284: results queue
empty 13040 1726882410.95285: checking for any_errors_fatal 13040 1726882410.95291: done checking for any_errors_fatal 13040 1726882410.95292: checking for max_fail_percentage 13040 1726882410.95294: done checking for max_fail_percentage 13040 1726882410.95295: checking to see if all hosts have failed and the running result is not ok 13040 1726882410.95296: done checking to see if all hosts have failed 13040 1726882410.95296: getting the remaining hosts for this loop 13040 1726882410.95298: done getting the remaining hosts for this loop 13040 1726882410.95301: getting the next task for host managed_node1 13040 1726882410.95309: done getting next task for host managed_node1 13040 1726882410.95313: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13040 1726882410.95317: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882410.95334: getting variables 13040 1726882410.95335: in VariableManager get_vars() 13040 1726882410.95423: Calling all_inventory to load vars for managed_node1 13040 1726882410.95427: Calling groups_inventory to load vars for managed_node1 13040 1726882410.95429: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882410.95438: Calling all_plugins_play to load vars for managed_node1 13040 1726882410.95441: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882410.95443: Calling groups_plugins_play to load vars for managed_node1 13040 1726882410.95740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882410.95968: done with get_vars() 13040 1726882410.95978: done getting variables 13040 1726882410.96050: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:33:30 -0400 (0:00:00.046) 0:00:08.438 ******
13040 1726882410.96086: entering _queue_task() for managed_node1/dnf 13040 1726882410.96381: worker is 1 (out of 1 available) 13040 1726882410.96395: exiting _queue_task() for managed_node1/dnf 13040 1726882410.96409: done queuing things up, now waiting for results queue to drain 13040 1726882410.96410: waiting for pending results...
13040 1726882410.96723: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13040 1726882410.96891: in run() - task 0e448fcc-3ce9-b123-314b-000000000171 13040 1726882410.96912: variable 'ansible_search_path' from source: unknown 13040 1726882410.96927: variable 'ansible_search_path' from source: unknown 13040 1726882410.96984: calling self._execute() 13040 1726882410.97098: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882410.97111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882410.97126: variable 'omit' from source: magic vars 13040 1726882410.97856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.01175: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.01260: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.01325: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.01443: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.01447: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.01541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.01545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882411.01555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.01600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.01614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.01773: variable 'ansible_distribution' from source: facts 13040 1726882411.01780: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.01799: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.01802: when evaluation is False, skipping this task 13040 1726882411.01804: _execute() done 13040 1726882411.01807: dumping result to json 13040 1726882411.01810: done dumping result, returning 13040 1726882411.01819: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000171] 13040 1726882411.01824: sending task result for task 0e448fcc-3ce9-b123-314b-000000000171 13040 1726882411.01932: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000171 13040 1726882411.01935: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882411.02012: no more pending results, returning what
we have 13040 1726882411.02016: results queue empty 13040 1726882411.02017: checking for any_errors_fatal 13040 1726882411.02023: done checking for any_errors_fatal 13040 1726882411.02023: checking for max_fail_percentage 13040 1726882411.02026: done checking for max_fail_percentage 13040 1726882411.02027: checking to see if all hosts have failed and the running result is not ok 13040 1726882411.02027: done checking to see if all hosts have failed 13040 1726882411.02028: getting the remaining hosts for this loop 13040 1726882411.02029: done getting the remaining hosts for this loop 13040 1726882411.02033: getting the next task for host managed_node1 13040 1726882411.02041: done getting next task for host managed_node1 13040 1726882411.02045: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13040 1726882411.02050: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.02077: getting variables 13040 1726882411.02080: in VariableManager get_vars() 13040 1726882411.02137: Calling all_inventory to load vars for managed_node1 13040 1726882411.02140: Calling groups_inventory to load vars for managed_node1 13040 1726882411.02143: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.02157: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.02161: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.02166: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.02357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.02589: done with get_vars() 13040 1726882411.02679: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13040 1726882411.02772: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:33:31 -0400 (0:00:00.067) 0:00:08.505 ******
13040 1726882411.02923: entering _queue_task() for managed_node1/yum 13040 1726882411.03301: worker is 1 (out of 1 available) 13040 1726882411.03312: exiting _queue_task() for managed_node1/yum 13040 1726882411.03324: done queuing things up, now waiting for results queue to drain 13040 1726882411.03325: waiting for pending results...
13040 1726882411.03624: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13040 1726882411.03787: in run() - task 0e448fcc-3ce9-b123-314b-000000000172 13040 1726882411.03813: variable 'ansible_search_path' from source: unknown 13040 1726882411.03822: variable 'ansible_search_path' from source: unknown 13040 1726882411.03868: calling self._execute() 13040 1726882411.04019: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.04043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.04071: variable 'omit' from source: magic vars 13040 1726882411.04587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.07249: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.07336: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.07385: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.07428: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.07466: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.07557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.07597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13040 1726882411.07635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.07686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.07713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.07870: variable 'ansible_distribution' from source: facts 13040 1726882411.07882: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.07906: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.07919: when evaluation is False, skipping this task 13040 1726882411.07926: _execute() done 13040 1726882411.07937: dumping result to json 13040 1726882411.07946: done dumping result, returning 13040 1726882411.07962: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000172] 13040 1726882411.07975: sending task result for task 0e448fcc-3ce9-b123-314b-000000000172 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882411.08137: no more pending results, returning what we have 13040 1726882411.08141: results queue empty 13040 1726882411.08142: checking for any_errors_fatal 13040 1726882411.08148: done checking 
for any_errors_fatal 13040 1726882411.08149: checking for max_fail_percentage 13040 1726882411.08151: done checking for max_fail_percentage 13040 1726882411.08154: checking to see if all hosts have failed and the running result is not ok 13040 1726882411.08155: done checking to see if all hosts have failed 13040 1726882411.08156: getting the remaining hosts for this loop 13040 1726882411.08158: done getting the remaining hosts for this loop 13040 1726882411.08162: getting the next task for host managed_node1 13040 1726882411.08171: done getting next task for host managed_node1 13040 1726882411.08176: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13040 1726882411.08181: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.08201: getting variables 13040 1726882411.08204: in VariableManager get_vars() 13040 1726882411.08271: Calling all_inventory to load vars for managed_node1 13040 1726882411.08274: Calling groups_inventory to load vars for managed_node1 13040 1726882411.08277: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.08289: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.08292: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.08295: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.08548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.08776: done with get_vars() 13040 1726882411.08788: done getting variables 13040 1726882411.09039: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000172 13040 1726882411.09042: WORKER PROCESS EXITING 13040 1726882411.09087: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:31 -0400 (0:00:00.063) 0:00:08.568 ****** 13040 1726882411.09122: entering _queue_task() for managed_node1/fail 13040 1726882411.09481: worker is 1 (out of 1 available) 13040 1726882411.09494: exiting _queue_task() for managed_node1/fail 13040 1726882411.09506: done queuing things up, now waiting for results queue to drain 13040 1726882411.09507: waiting for pending results... 
13040 1726882411.09803: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13040 1726882411.09970: in run() - task 0e448fcc-3ce9-b123-314b-000000000173 13040 1726882411.09990: variable 'ansible_search_path' from source: unknown 13040 1726882411.09999: variable 'ansible_search_path' from source: unknown 13040 1726882411.10044: calling self._execute() 13040 1726882411.10141: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.10156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.10177: variable 'omit' from source: magic vars 13040 1726882411.10649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.13259: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.13358: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.13410: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.13457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.13491: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.13585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.13621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.13665: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.13712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.13733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.13894: variable 'ansible_distribution' from source: facts 13040 1726882411.13905: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.13927: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.13935: when evaluation is False, skipping this task 13040 1726882411.13943: _execute() done 13040 1726882411.13949: dumping result to json 13040 1726882411.13964: done dumping result, returning 13040 1726882411.13979: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000173] 13040 1726882411.13994: sending task result for task 0e448fcc-3ce9-b123-314b-000000000173 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882411.14161: no more pending results, returning what we have 13040 1726882411.14167: results queue empty 13040 1726882411.14168: checking for any_errors_fatal 13040 1726882411.14175: done checking for any_errors_fatal 13040 1726882411.14175: checking for max_fail_percentage 13040 
1726882411.14177: done checking for max_fail_percentage 13040 1726882411.14178: checking to see if all hosts have failed and the running result is not ok 13040 1726882411.14179: done checking to see if all hosts have failed 13040 1726882411.14180: getting the remaining hosts for this loop 13040 1726882411.14181: done getting the remaining hosts for this loop 13040 1726882411.14185: getting the next task for host managed_node1 13040 1726882411.14193: done getting next task for host managed_node1 13040 1726882411.14197: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13040 1726882411.14202: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.14224: getting variables 13040 1726882411.14226: in VariableManager get_vars() 13040 1726882411.14286: Calling all_inventory to load vars for managed_node1 13040 1726882411.14289: Calling groups_inventory to load vars for managed_node1 13040 1726882411.14292: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.14303: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.14306: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.14309: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.14502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.14736: done with get_vars() 13040 1726882411.14748: done getting variables 13040 1726882411.14935: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000173 13040 1726882411.14938: WORKER PROCESS EXITING 13040 1726882411.14979: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:31 -0400 (0:00:00.058) 0:00:08.627 ****** 13040 1726882411.15021: entering _queue_task() for managed_node1/package 13040 1726882411.15472: worker is 1 (out of 1 available) 13040 1726882411.15485: exiting _queue_task() for managed_node1/package 13040 1726882411.15496: done queuing things up, now waiting for results queue to drain 13040 1726882411.15497: waiting for pending results... 
13040 1726882411.15773: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 13040 1726882411.15927: in run() - task 0e448fcc-3ce9-b123-314b-000000000174 13040 1726882411.15949: variable 'ansible_search_path' from source: unknown 13040 1726882411.15958: variable 'ansible_search_path' from source: unknown 13040 1726882411.15997: calling self._execute() 13040 1726882411.16090: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.16100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.16116: variable 'omit' from source: magic vars 13040 1726882411.16568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.19307: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.19387: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.19433: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.19478: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.19562: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.19720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.19894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.19925: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.19980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.20093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.20349: variable 'ansible_distribution' from source: facts 13040 1726882411.20366: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.20388: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.20409: when evaluation is False, skipping this task 13040 1726882411.20519: _execute() done 13040 1726882411.20526: dumping result to json 13040 1726882411.20533: done dumping result, returning 13040 1726882411.20544: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-b123-314b-000000000174] 13040 1726882411.20556: sending task result for task 0e448fcc-3ce9-b123-314b-000000000174 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882411.20718: no more pending results, returning what we have 13040 1726882411.20722: results queue empty 13040 1726882411.20723: checking for any_errors_fatal 13040 1726882411.20731: done checking for any_errors_fatal 13040 1726882411.20731: checking for max_fail_percentage 13040 1726882411.20733: done checking for max_fail_percentage 13040 1726882411.20734: 
checking to see if all hosts have failed and the running result is not ok 13040 1726882411.20735: done checking to see if all hosts have failed 13040 1726882411.20736: getting the remaining hosts for this loop 13040 1726882411.20738: done getting the remaining hosts for this loop 13040 1726882411.20742: getting the next task for host managed_node1 13040 1726882411.20749: done getting next task for host managed_node1 13040 1726882411.20756: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13040 1726882411.20760: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.20784: getting variables 13040 1726882411.20786: in VariableManager get_vars() 13040 1726882411.20840: Calling all_inventory to load vars for managed_node1 13040 1726882411.20844: Calling groups_inventory to load vars for managed_node1 13040 1726882411.20846: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.20860: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.20865: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.20869: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.21110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.21331: done with get_vars() 13040 1726882411.21342: done getting variables 13040 1726882411.21404: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:31 -0400 (0:00:00.064) 0:00:08.691 ****** 13040 1726882411.21442: entering _queue_task() for managed_node1/package 13040 1726882411.21467: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000174 13040 1726882411.21474: WORKER PROCESS EXITING 13040 1726882411.22308: worker is 1 (out of 1 available) 13040 1726882411.22321: exiting _queue_task() for managed_node1/package 13040 1726882411.22333: done queuing things up, now waiting for results queue to drain 13040 1726882411.22335: waiting for pending results... 
13040 1726882411.22708: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13040 1726882411.22861: in run() - task 0e448fcc-3ce9-b123-314b-000000000175 13040 1726882411.22886: variable 'ansible_search_path' from source: unknown 13040 1726882411.22897: variable 'ansible_search_path' from source: unknown 13040 1726882411.22937: calling self._execute() 13040 1726882411.23029: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.23040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.23058: variable 'omit' from source: magic vars 13040 1726882411.23508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.27527: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.27617: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.27671: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.27712: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.27741: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.27827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.27871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.27901: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.27945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.27971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.28114: variable 'ansible_distribution' from source: facts 13040 1726882411.28125: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.28145: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.28154: when evaluation is False, skipping this task 13040 1726882411.28162: _execute() done 13040 1726882411.28169: dumping result to json 13040 1726882411.28176: done dumping result, returning 13040 1726882411.28190: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-b123-314b-000000000175] 13040 1726882411.28200: sending task result for task 0e448fcc-3ce9-b123-314b-000000000175 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882411.28356: no more pending results, returning what we have 13040 1726882411.28361: results queue empty 13040 1726882411.28362: checking for any_errors_fatal 13040 1726882411.28371: done checking for any_errors_fatal 13040 1726882411.28372: checking for max_fail_percentage 13040 
1726882411.28374: done checking for max_fail_percentage 13040 1726882411.28375: checking to see if all hosts have failed and the running result is not ok 13040 1726882411.28375: done checking to see if all hosts have failed 13040 1726882411.28376: getting the remaining hosts for this loop 13040 1726882411.28378: done getting the remaining hosts for this loop 13040 1726882411.28382: getting the next task for host managed_node1 13040 1726882411.28389: done getting next task for host managed_node1 13040 1726882411.28394: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13040 1726882411.28398: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.28419: getting variables 13040 1726882411.28421: in VariableManager get_vars() 13040 1726882411.28483: Calling all_inventory to load vars for managed_node1 13040 1726882411.28487: Calling groups_inventory to load vars for managed_node1 13040 1726882411.28489: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.28501: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.28504: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.28507: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.28689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.28912: done with get_vars() 13040 1726882411.28925: done getting variables 13040 1726882411.28990: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:31 -0400 (0:00:00.075) 0:00:08.767 ****** 13040 1726882411.29028: entering _queue_task() for managed_node1/package 13040 1726882411.29047: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000175 13040 1726882411.29058: WORKER PROCESS EXITING 13040 1726882411.29563: worker is 1 (out of 1 available) 13040 1726882411.29575: exiting _queue_task() for managed_node1/package 13040 1726882411.29587: done queuing things up, now waiting for results queue to drain 13040 1726882411.29588: waiting for pending results... 
13040 1726882411.29862: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13040 1726882411.30012: in run() - task 0e448fcc-3ce9-b123-314b-000000000176 13040 1726882411.30033: variable 'ansible_search_path' from source: unknown 13040 1726882411.30040: variable 'ansible_search_path' from source: unknown 13040 1726882411.30084: calling self._execute() 13040 1726882411.30177: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.30188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.30202: variable 'omit' from source: magic vars 13040 1726882411.30632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.33346: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.33542: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.33646: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.33692: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.33856: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.33942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.34032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.34189: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.34235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.34259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.34634: variable 'ansible_distribution' from source: facts 13040 1726882411.34647: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.34677: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.34685: when evaluation is False, skipping this task 13040 1726882411.34693: _execute() done 13040 1726882411.34699: dumping result to json 13040 1726882411.34711: done dumping result, returning 13040 1726882411.34723: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-b123-314b-000000000176] 13040 1726882411.34734: sending task result for task 0e448fcc-3ce9-b123-314b-000000000176 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882411.34897: no more pending results, returning what we have 13040 1726882411.34902: results queue empty 13040 1726882411.34903: checking for any_errors_fatal 13040 1726882411.34909: done checking for any_errors_fatal 13040 1726882411.34909: checking for max_fail_percentage 13040 1726882411.34912: done checking for 
max_fail_percentage 13040 1726882411.34912: checking to see if all hosts have failed and the running result is not ok 13040 1726882411.34913: done checking to see if all hosts have failed 13040 1726882411.34914: getting the remaining hosts for this loop 13040 1726882411.34916: done getting the remaining hosts for this loop 13040 1726882411.34920: getting the next task for host managed_node1 13040 1726882411.34928: done getting next task for host managed_node1 13040 1726882411.34932: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13040 1726882411.34937: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.34960: getting variables 13040 1726882411.34965: in VariableManager get_vars() 13040 1726882411.35020: Calling all_inventory to load vars for managed_node1 13040 1726882411.35023: Calling groups_inventory to load vars for managed_node1 13040 1726882411.35026: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.35037: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.35040: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.35043: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.35293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.35518: done with get_vars() 13040 1726882411.35529: done getting variables 13040 1726882411.36019: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000176 13040 1726882411.36023: WORKER PROCESS EXITING 13040 1726882411.36050: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:31 -0400 (0:00:00.072) 0:00:08.839 ****** 13040 1726882411.36230: entering _queue_task() for managed_node1/service 13040 1726882411.36769: worker is 1 (out of 1 available) 13040 1726882411.36783: exiting _queue_task() for managed_node1/service 13040 1726882411.36793: done queuing things up, now waiting for results queue to drain 13040 1726882411.36795: waiting for pending results... 
13040 1726882411.37096: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13040 1726882411.37255: in run() - task 0e448fcc-3ce9-b123-314b-000000000177 13040 1726882411.37277: variable 'ansible_search_path' from source: unknown 13040 1726882411.37285: variable 'ansible_search_path' from source: unknown 13040 1726882411.37325: calling self._execute() 13040 1726882411.37419: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.37430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.37444: variable 'omit' from source: magic vars 13040 1726882411.37891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.40358: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.40447: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.40494: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.40533: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.40568: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.40650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.40688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.40721: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.40770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.40788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.40931: variable 'ansible_distribution' from source: facts 13040 1726882411.40944: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.40969: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.40977: when evaluation is False, skipping this task 13040 1726882411.40983: _execute() done 13040 1726882411.40989: dumping result to json 13040 1726882411.40996: done dumping result, returning 13040 1726882411.41008: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-b123-314b-000000000177] 13040 1726882411.41018: sending task result for task 0e448fcc-3ce9-b123-314b-000000000177 13040 1726882411.41140: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000177 13040 1726882411.41147: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882411.41202: no more pending results, returning what we have 13040 1726882411.41206: results queue empty 13040 1726882411.41207: checking for any_errors_fatal 13040 
1726882411.41214: done checking for any_errors_fatal 13040 1726882411.41215: checking for max_fail_percentage 13040 1726882411.41216: done checking for max_fail_percentage 13040 1726882411.41217: checking to see if all hosts have failed and the running result is not ok 13040 1726882411.41218: done checking to see if all hosts have failed 13040 1726882411.41219: getting the remaining hosts for this loop 13040 1726882411.41220: done getting the remaining hosts for this loop 13040 1726882411.41224: getting the next task for host managed_node1 13040 1726882411.41232: done getting next task for host managed_node1 13040 1726882411.41236: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13040 1726882411.41240: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.41267: getting variables 13040 1726882411.41269: in VariableManager get_vars() 13040 1726882411.41325: Calling all_inventory to load vars for managed_node1 13040 1726882411.41328: Calling groups_inventory to load vars for managed_node1 13040 1726882411.41331: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.41342: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.41345: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.41348: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.41537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.41758: done with get_vars() 13040 1726882411.41872: done getting variables 13040 1726882411.41933: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:31 -0400 (0:00:00.057) 0:00:08.896 ****** 13040 1726882411.41973: entering _queue_task() for managed_node1/service 13040 1726882411.42397: worker is 1 (out of 1 available) 13040 1726882411.42411: exiting _queue_task() for managed_node1/service 13040 1726882411.42423: done queuing things up, now waiting for results queue to drain 13040 1726882411.42424: waiting for pending results... 
13040 1726882411.42707: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13040 1726882411.42848: in run() - task 0e448fcc-3ce9-b123-314b-000000000178 13040 1726882411.42876: variable 'ansible_search_path' from source: unknown 13040 1726882411.42885: variable 'ansible_search_path' from source: unknown 13040 1726882411.42923: calling self._execute() 13040 1726882411.43015: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.43026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.43039: variable 'omit' from source: magic vars 13040 1726882411.43499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.45971: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.46047: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.46094: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.46132: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.46170: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.46255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.46292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.46322: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.46375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.46395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.46538: variable 'ansible_distribution' from source: facts 13040 1726882411.46549: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.46577: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.46587: when evaluation is False, skipping this task 13040 1726882411.46594: _execute() done 13040 1726882411.46600: dumping result to json 13040 1726882411.46607: done dumping result, returning 13040 1726882411.46617: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-b123-314b-000000000178] 13040 1726882411.46627: sending task result for task 0e448fcc-3ce9-b123-314b-000000000178 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13040 1726882411.46779: no more pending results, returning what we have 13040 1726882411.46783: results queue empty 13040 1726882411.46784: checking for any_errors_fatal 13040 1726882411.46793: done checking for any_errors_fatal 13040 1726882411.46794: checking for max_fail_percentage 13040 1726882411.46796: done checking for max_fail_percentage 13040 1726882411.46797: checking to see if all hosts have failed 
and the running result is not ok 13040 1726882411.46797: done checking to see if all hosts have failed 13040 1726882411.46798: getting the remaining hosts for this loop 13040 1726882411.46800: done getting the remaining hosts for this loop 13040 1726882411.46803: getting the next task for host managed_node1 13040 1726882411.46811: done getting next task for host managed_node1 13040 1726882411.46815: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13040 1726882411.46820: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.46841: getting variables 13040 1726882411.46843: in VariableManager get_vars() 13040 1726882411.46901: Calling all_inventory to load vars for managed_node1 13040 1726882411.46904: Calling groups_inventory to load vars for managed_node1 13040 1726882411.46906: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.46918: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.46920: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.46923: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.47108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.47388: done with get_vars() 13040 1726882411.47399: done getting variables 13040 1726882411.47462: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13040 1726882411.47787: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000178 13040 1726882411.47791: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:31 -0400 (0:00:00.058) 0:00:08.955 ****** 13040 1726882411.47804: entering _queue_task() for managed_node1/service 13040 1726882411.48054: worker is 1 (out of 1 available) 13040 1726882411.48069: exiting _queue_task() for managed_node1/service 13040 1726882411.48081: done queuing things up, now waiting for results queue to drain 13040 1726882411.48082: waiting for pending results... 
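Every skip in this stretch of the log traces back to the same guard, reported verbatim in each result as `"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)"`. As a hypothetical sketch only (the actual task bodies in `roles/network/tasks/main.yml` are not reproduced in this log, and the module arguments shown here are illustrative), a task guarded this way looks like:

```yaml
# Illustrative sketch of the conditional seen in the skip results above.
# The task name matches the log; the service arguments are assumed.
- name: Enable and start wpa_supplicant
  service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    # Both list items must be true for the task to run. The managed
    # node's facts here report a distribution/version for which this
    # evaluates to False, so Ansible skips the task and records the
    # rendered expression as "false_condition" in the result.
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9
```

Note the `| int` cast: `ansible_distribution_major_version` is gathered as a string fact, so the numeric comparison would be unreliable without it.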
13040 1726882411.48358: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13040 1726882411.48504: in run() - task 0e448fcc-3ce9-b123-314b-000000000179 13040 1726882411.48527: variable 'ansible_search_path' from source: unknown 13040 1726882411.48534: variable 'ansible_search_path' from source: unknown 13040 1726882411.48577: calling self._execute() 13040 1726882411.48671: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.48682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.48695: variable 'omit' from source: magic vars 13040 1726882411.49148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.52635: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.53489: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.53591: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.53692: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.53723: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.53839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.54012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.54043: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.54116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.54215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.54403: variable 'ansible_distribution' from source: facts 13040 1726882411.54532: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.54557: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.54567: when evaluation is False, skipping this task 13040 1726882411.54575: _execute() done 13040 1726882411.54636: dumping result to json 13040 1726882411.54644: done dumping result, returning 13040 1726882411.54662: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-b123-314b-000000000179] 13040 1726882411.54677: sending task result for task 0e448fcc-3ce9-b123-314b-000000000179 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882411.54833: no more pending results, returning what we have 13040 1726882411.54837: results queue empty 13040 1726882411.54838: checking for any_errors_fatal 13040 1726882411.54846: done checking for any_errors_fatal 13040 1726882411.54847: checking for max_fail_percentage 13040 1726882411.54848: done checking for max_fail_percentage 13040 
1726882411.54849: checking to see if all hosts have failed and the running result is not ok 13040 1726882411.54850: done checking to see if all hosts have failed 13040 1726882411.54854: getting the remaining hosts for this loop 13040 1726882411.54855: done getting the remaining hosts for this loop 13040 1726882411.54859: getting the next task for host managed_node1 13040 1726882411.54869: done getting next task for host managed_node1 13040 1726882411.54873: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13040 1726882411.54878: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.54899: getting variables 13040 1726882411.54901: in VariableManager get_vars() 13040 1726882411.54958: Calling all_inventory to load vars for managed_node1 13040 1726882411.54961: Calling groups_inventory to load vars for managed_node1 13040 1726882411.54965: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.54977: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.54980: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.54983: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.55169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.55395: done with get_vars() 13040 1726882411.55407: done getting variables 13040 1726882411.55473: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:31 -0400 (0:00:00.077) 0:00:09.032 ****** 13040 1726882411.55510: entering _queue_task() for managed_node1/service 13040 1726882411.55576: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000179 13040 1726882411.55583: WORKER PROCESS EXITING 13040 1726882411.56417: worker is 1 (out of 1 available) 13040 1726882411.56430: exiting _queue_task() for managed_node1/service 13040 1726882411.56443: done queuing things up, now waiting for results queue to drain 13040 1726882411.56444: waiting for pending results... 
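Two of the skipped results above ("Enable and start NetworkManager" and "Enable network service") print a `"censored"` placeholder instead of the usual `false_condition` field. That is the documented effect of `no_log: true` on a task: the full result dictionary is withheld from output. As a hedged sketch (the real task definition is not part of this log; the service arguments are assumptions), such a task might look like:

```yaml
# Illustrative sketch only -- module arguments are assumed.
- name: Enable and start NetworkManager
  service:
    name: NetworkManager
    state: started
    enabled: true
  # no_log suppresses the result details in all output, which is why
  # the skip above shows only "the output has been hidden due to the
  # fact that 'no_log: true' was specified for this result".
  no_log: true
```

This is why the same underlying conditional produces two different-looking skip messages in this run: the guard is identical, but `no_log` hides the `false_condition` detail for these two tasks.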
13040 1726882411.57385: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 13040 1726882411.57617: in run() - task 0e448fcc-3ce9-b123-314b-00000000017a 13040 1726882411.57635: variable 'ansible_search_path' from source: unknown 13040 1726882411.57644: variable 'ansible_search_path' from source: unknown 13040 1726882411.57805: calling self._execute() 13040 1726882411.57900: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.58005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.58022: variable 'omit' from source: magic vars 13040 1726882411.58918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.62589: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.62659: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.62709: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.62746: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.62780: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.62862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.62898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.62930: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.62981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.63000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.63258: variable 'ansible_distribution' from source: facts 13040 1726882411.63271: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.63292: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.63299: when evaluation is False, skipping this task 13040 1726882411.63305: _execute() done 13040 1726882411.63311: dumping result to json 13040 1726882411.63317: done dumping result, returning 13040 1726882411.63327: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-b123-314b-00000000017a] 13040 1726882411.63341: sending task result for task 0e448fcc-3ce9-b123-314b-00000000017a skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13040 1726882411.63488: no more pending results, returning what we have 13040 1726882411.63492: results queue empty 13040 1726882411.63493: checking for any_errors_fatal 13040 1726882411.63501: done checking for any_errors_fatal 13040 1726882411.63502: checking for max_fail_percentage 13040 1726882411.63504: done checking for max_fail_percentage 13040 1726882411.63505: checking to see if all hosts have failed and the 
running result is not ok 13040 1726882411.63505: done checking to see if all hosts have failed 13040 1726882411.63506: getting the remaining hosts for this loop 13040 1726882411.63508: done getting the remaining hosts for this loop 13040 1726882411.63511: getting the next task for host managed_node1 13040 1726882411.63519: done getting next task for host managed_node1 13040 1726882411.63523: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13040 1726882411.63528: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.63548: getting variables 13040 1726882411.63550: in VariableManager get_vars() 13040 1726882411.63608: Calling all_inventory to load vars for managed_node1 13040 1726882411.63611: Calling groups_inventory to load vars for managed_node1 13040 1726882411.63614: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.63625: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.63628: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.63631: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.63869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.64302: done with get_vars() 13040 1726882411.64313: done getting variables 13040 1726882411.64624: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000017a 13040 1726882411.64628: WORKER PROCESS EXITING 13040 1726882411.64667: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:31 -0400 (0:00:00.091) 0:00:09.124 ****** 13040 1726882411.64701: entering _queue_task() for managed_node1/copy 13040 1726882411.64940: worker is 1 (out of 1 available) 13040 1726882411.64954: exiting _queue_task() for managed_node1/copy 13040 1726882411.64969: done queuing things up, now waiting for results queue to drain 13040 1726882411.64970: waiting for pending results... 
13040 1726882411.65732: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13040 1726882411.65893: in run() - task 0e448fcc-3ce9-b123-314b-00000000017b 13040 1726882411.65912: variable 'ansible_search_path' from source: unknown 13040 1726882411.65918: variable 'ansible_search_path' from source: unknown 13040 1726882411.65963: calling self._execute() 13040 1726882411.66046: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.66060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.66079: variable 'omit' from source: magic vars 13040 1726882411.66508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.69914: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.69991: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.70034: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.70089: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.70125: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.70210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.70246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.70281: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.70329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.70349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.70495: variable 'ansible_distribution' from source: facts 13040 1726882411.70505: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.70525: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.70532: when evaluation is False, skipping this task 13040 1726882411.70544: _execute() done 13040 1726882411.70550: dumping result to json 13040 1726882411.70560: done dumping result, returning 13040 1726882411.70576: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-b123-314b-00000000017b] 13040 1726882411.70587: sending task result for task 0e448fcc-3ce9-b123-314b-00000000017b skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882411.70744: no more pending results, returning what we have 13040 1726882411.70748: results queue empty 13040 1726882411.70749: checking for any_errors_fatal 13040 1726882411.70759: done checking for any_errors_fatal 13040 1726882411.70760: checking for max_fail_percentage 13040 1726882411.70762: done checking for 
max_fail_percentage 13040 1726882411.70765: checking to see if all hosts have failed and the running result is not ok 13040 1726882411.70766: done checking to see if all hosts have failed 13040 1726882411.70767: getting the remaining hosts for this loop 13040 1726882411.70768: done getting the remaining hosts for this loop 13040 1726882411.70772: getting the next task for host managed_node1 13040 1726882411.70779: done getting next task for host managed_node1 13040 1726882411.70784: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13040 1726882411.70788: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.70809: getting variables 13040 1726882411.70811: in VariableManager get_vars() 13040 1726882411.70872: Calling all_inventory to load vars for managed_node1 13040 1726882411.70876: Calling groups_inventory to load vars for managed_node1 13040 1726882411.70878: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.70889: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.70892: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.70895: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.71080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.71372: done with get_vars() 13040 1726882411.71384: done getting variables 13040 1726882411.71657: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000017b 13040 1726882411.71660: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:31 -0400 (0:00:00.069) 0:00:09.193 ****** 13040 1726882411.71677: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13040 1726882411.72045: worker is 1 (out of 1 available) 13040 1726882411.72059: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 13040 1726882411.72076: done queuing things up, now waiting for results queue to drain 13040 1726882411.72077: waiting for pending results... 
13040 1726882411.72350: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13040 1726882411.72496: in run() - task 0e448fcc-3ce9-b123-314b-00000000017c 13040 1726882411.72519: variable 'ansible_search_path' from source: unknown 13040 1726882411.72527: variable 'ansible_search_path' from source: unknown 13040 1726882411.72571: calling self._execute() 13040 1726882411.72662: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.72676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.72690: variable 'omit' from source: magic vars 13040 1726882411.73138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.77255: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.77361: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.77422: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.77463: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.77496: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.77582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.77616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.77651: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.77701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.77719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.77874: variable 'ansible_distribution' from source: facts 13040 1726882411.77885: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.77907: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.77916: when evaluation is False, skipping this task 13040 1726882411.77923: _execute() done 13040 1726882411.77929: dumping result to json 13040 1726882411.77935: done dumping result, returning 13040 1726882411.77946: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-b123-314b-00000000017c] 13040 1726882411.77963: sending task result for task 0e448fcc-3ce9-b123-314b-00000000017c skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882411.78132: no more pending results, returning what we have 13040 1726882411.78137: results queue empty 13040 1726882411.78138: checking for any_errors_fatal 13040 1726882411.78144: done checking for any_errors_fatal 13040 1726882411.78145: checking for max_fail_percentage 13040 1726882411.78147: done checking for max_fail_percentage 
13040 1726882411.78148: checking to see if all hosts have failed and the running result is not ok 13040 1726882411.78149: done checking to see if all hosts have failed 13040 1726882411.78150: getting the remaining hosts for this loop 13040 1726882411.78154: done getting the remaining hosts for this loop 13040 1726882411.78158: getting the next task for host managed_node1 13040 1726882411.78170: done getting next task for host managed_node1 13040 1726882411.78174: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13040 1726882411.78178: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.78198: getting variables 13040 1726882411.78200: in VariableManager get_vars() 13040 1726882411.78257: Calling all_inventory to load vars for managed_node1 13040 1726882411.78262: Calling groups_inventory to load vars for managed_node1 13040 1726882411.78267: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.78278: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.78281: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.78284: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.78522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.78737: done with get_vars() 13040 1726882411.78748: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:31 -0400 (0:00:00.071) 0:00:09.265 ****** 13040 1726882411.78842: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13040 1726882411.78867: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000017c 13040 1726882411.78877: WORKER PROCESS EXITING 13040 1726882411.79428: worker is 1 (out of 1 available) 13040 1726882411.79439: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 13040 1726882411.79455: done queuing things up, now waiting for results queue to drain 13040 1726882411.79456: waiting for pending results... 
13040 1726882411.80080: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 13040 1726882411.80297: in run() - task 0e448fcc-3ce9-b123-314b-00000000017d 13040 1726882411.80393: variable 'ansible_search_path' from source: unknown 13040 1726882411.80447: variable 'ansible_search_path' from source: unknown 13040 1726882411.80492: calling self._execute() 13040 1726882411.80644: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.80734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.80749: variable 'omit' from source: magic vars 13040 1726882411.81472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.84243: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.84323: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.84388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.84425: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.84459: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.84542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.84584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.84614: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.84665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.84690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.84836: variable 'ansible_distribution' from source: facts 13040 1726882411.84847: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.84874: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.84881: when evaluation is False, skipping this task 13040 1726882411.84888: _execute() done 13040 1726882411.84893: dumping result to json 13040 1726882411.84901: done dumping result, returning 13040 1726882411.84915: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-b123-314b-00000000017d] 13040 1726882411.84925: sending task result for task 0e448fcc-3ce9-b123-314b-00000000017d skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882411.85083: no more pending results, returning what we have 13040 1726882411.85087: results queue empty 13040 1726882411.85088: checking for any_errors_fatal 13040 1726882411.85096: done checking for any_errors_fatal 13040 1726882411.85097: checking for max_fail_percentage 13040 1726882411.85099: done checking for max_fail_percentage 13040 
1726882411.85100: checking to see if all hosts have failed and the running result is not ok 13040 1726882411.85101: done checking to see if all hosts have failed 13040 1726882411.85102: getting the remaining hosts for this loop 13040 1726882411.85103: done getting the remaining hosts for this loop 13040 1726882411.85108: getting the next task for host managed_node1 13040 1726882411.85115: done getting next task for host managed_node1 13040 1726882411.85120: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13040 1726882411.85124: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.85144: getting variables 13040 1726882411.85146: in VariableManager get_vars() 13040 1726882411.85211: Calling all_inventory to load vars for managed_node1 13040 1726882411.85214: Calling groups_inventory to load vars for managed_node1 13040 1726882411.85216: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.85227: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.85230: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.85233: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.85419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.85643: done with get_vars() 13040 1726882411.85657: done getting variables 13040 1726882411.85895: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000017d 13040 1726882411.85898: WORKER PROCESS EXITING 13040 1726882411.85933: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:31 -0400 (0:00:00.071) 0:00:09.336 ****** 13040 1726882411.85974: entering _queue_task() for managed_node1/debug 13040 1726882411.86331: worker is 1 (out of 1 available) 13040 1726882411.86343: exiting _queue_task() for managed_node1/debug 13040 1726882411.86358: done queuing things up, now waiting for results queue to drain 13040 1726882411.86360: waiting for pending results... 
13040 1726882411.86645: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13040 1726882411.86788: in run() - task 0e448fcc-3ce9-b123-314b-00000000017e 13040 1726882411.86812: variable 'ansible_search_path' from source: unknown 13040 1726882411.86821: variable 'ansible_search_path' from source: unknown 13040 1726882411.86868: calling self._execute() 13040 1726882411.86965: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.86978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.86992: variable 'omit' from source: magic vars 13040 1726882411.87457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.90104: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.90184: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.90227: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.90271: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.90306: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.90388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.90426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.90459: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882411.90510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882411.90532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882411.90683: variable 'ansible_distribution' from source: facts 13040 1726882411.90694: variable 'ansible_distribution_major_version' from source: facts 13040 1726882411.90715: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882411.90723: when evaluation is False, skipping this task 13040 1726882411.90734: _execute() done 13040 1726882411.90741: dumping result to json 13040 1726882411.90748: done dumping result, returning 13040 1726882411.90762: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-b123-314b-00000000017e] 13040 1726882411.90774: sending task result for task 0e448fcc-3ce9-b123-314b-00000000017e skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 13040 1726882411.90918: no more pending results, returning what we have 13040 1726882411.90923: results queue empty 13040 1726882411.90924: checking for any_errors_fatal 13040 1726882411.90930: done checking for any_errors_fatal 13040 1726882411.90931: checking for max_fail_percentage 13040 1726882411.90933: done checking for max_fail_percentage 13040 1726882411.90934: checking to see if all hosts have 
failed and the running result is not ok 13040 1726882411.90934: done checking to see if all hosts have failed 13040 1726882411.90936: getting the remaining hosts for this loop 13040 1726882411.90937: done getting the remaining hosts for this loop 13040 1726882411.90940: getting the next task for host managed_node1 13040 1726882411.90949: done getting next task for host managed_node1 13040 1726882411.90956: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13040 1726882411.90960: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882411.90982: getting variables 13040 1726882411.90985: in VariableManager get_vars() 13040 1726882411.91039: Calling all_inventory to load vars for managed_node1 13040 1726882411.91042: Calling groups_inventory to load vars for managed_node1 13040 1726882411.91045: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882411.91058: Calling all_plugins_play to load vars for managed_node1 13040 1726882411.91062: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882411.91068: Calling groups_plugins_play to load vars for managed_node1 13040 1726882411.91250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882411.91958: done with get_vars() 13040 1726882411.91970: done getting variables 13040 1726882411.92102: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000017e 13040 1726882411.92106: WORKER PROCESS EXITING 13040 1726882411.92140: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:31 -0400 (0:00:00.062) 0:00:09.398 ****** 13040 1726882411.92179: entering _queue_task() for managed_node1/debug 13040 1726882411.92438: worker is 1 (out of 1 available) 13040 1726882411.92450: exiting _queue_task() for managed_node1/debug 13040 1726882411.92466: done queuing things up, now waiting for results queue to drain 13040 1726882411.92468: waiting for pending results... 
13040 1726882411.92748: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13040 1726882411.92905: in run() - task 0e448fcc-3ce9-b123-314b-00000000017f 13040 1726882411.92926: variable 'ansible_search_path' from source: unknown 13040 1726882411.92934: variable 'ansible_search_path' from source: unknown 13040 1726882411.92978: calling self._execute() 13040 1726882411.93078: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882411.93092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882411.93108: variable 'omit' from source: magic vars 13040 1726882411.93569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882411.96267: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882411.96358: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882411.96405: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882411.96447: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882411.96484: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882411.96684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882411.96719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882411.96757: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882411.96895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882411.96916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882411.97286: variable 'ansible_distribution' from source: facts
13040 1726882411.97297: variable 'ansible_distribution_major_version' from source: facts
13040 1726882411.97318: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882411.97325: when evaluation is False, skipping this task
13040 1726882411.97333: _execute() done
13040 1726882411.97339: dumping result to json
13040 1726882411.97346: done dumping result, returning
13040 1726882411.97361: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-b123-314b-00000000017f]
13040 1726882411.97374: sending task result for task 0e448fcc-3ce9-b123-314b-00000000017f
skipping: [managed_node1] => {
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)"
}
13040 1726882411.97536: no more pending results, returning what we have
13040 1726882411.97540: results queue empty
13040 1726882411.97541: checking for any_errors_fatal
13040 1726882411.97550: done checking for any_errors_fatal
13040 1726882411.97554: checking for max_fail_percentage
13040 1726882411.97556: done checking for max_fail_percentage
13040 1726882411.97557: checking to see if all hosts have failed and the running result is not ok
13040 1726882411.97557: done checking to see if all hosts have failed
13040 1726882411.97559: getting the remaining hosts for this loop
13040 1726882411.97560: done getting the remaining hosts for this loop
13040 1726882411.97566: getting the next task for host managed_node1
13040 1726882411.97575: done getting next task for host managed_node1
13040 1726882411.97579: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
13040 1726882411.97584: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13040 1726882411.97605: getting variables
13040 1726882411.97607: in VariableManager get_vars()
13040 1726882411.97669: Calling all_inventory to load vars for managed_node1
13040 1726882411.97672: Calling groups_inventory to load vars for managed_node1
13040 1726882411.97675: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882411.97686: Calling all_plugins_play to load vars for managed_node1
13040 1726882411.97689: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882411.97692: Calling groups_plugins_play to load vars for managed_node1
13040 1726882411.97883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882411.98111: done with get_vars()
13040 1726882411.98123: done getting variables
13040 1726882411.98188: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:33:31 -0400 (0:00:00.060) 0:00:09.459 ******
13040 1726882411.98226: entering _queue_task() for managed_node1/debug
13040 1726882411.98275: done sending task result for task 0e448fcc-3ce9-b123-314b-00000000017f
13040 1726882411.98283: WORKER PROCESS EXITING
13040 1726882411.99173: worker is 1 (out of 1 available)
13040 1726882411.99185: exiting _queue_task() for managed_node1/debug
13040 1726882411.99197: done queuing things up, now waiting for results queue to drain
13040 1726882411.99198: waiting for pending results...
13040 1726882411.99668: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
13040 1726882411.99830: in run() - task 0e448fcc-3ce9-b123-314b-000000000180
13040 1726882411.99849: variable 'ansible_search_path' from source: unknown
13040 1726882411.99860: variable 'ansible_search_path' from source: unknown
13040 1726882411.99909: calling self._execute()
13040 1726882412.00009: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882412.00021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882412.00035: variable 'omit' from source: magic vars
13040 1726882412.00499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882412.03157: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882412.03236: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882412.03287: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882412.03326: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882412.03360: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882412.03442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882412.03482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882412.03516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882412.03562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882412.03584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882412.03730: variable 'ansible_distribution' from source: facts
13040 1726882412.03740: variable 'ansible_distribution_major_version' from source: facts
13040 1726882412.03766: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882412.03773: when evaluation is False, skipping this task
13040 1726882412.03779: _execute() done
13040 1726882412.03785: dumping result to json
13040 1726882412.03791: done dumping result, returning
13040 1726882412.03802: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-b123-314b-000000000180]
13040 1726882412.03811: sending task result for task 0e448fcc-3ce9-b123-314b-000000000180
13040 1726882412.03922: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000180
skipping: [managed_node1] => {
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)"
}
13040 1726882412.03975: no more pending results, returning what we have
13040 1726882412.03979: results queue empty
13040 1726882412.03980: checking for any_errors_fatal
13040 1726882412.03985: done checking for any_errors_fatal
13040 1726882412.03986: checking for max_fail_percentage
13040 1726882412.03988: done checking for max_fail_percentage
13040 1726882412.03989: checking to see if all hosts have failed and the running result is not ok
13040 1726882412.03990: done checking to see if all hosts have failed
13040 1726882412.03991: getting the remaining hosts for this loop
13040 1726882412.03992: done getting the remaining hosts for this loop
13040 1726882412.03996: getting the next task for host managed_node1
13040 1726882412.04004: done getting next task for host managed_node1
13040 1726882412.04009: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
13040 1726882412.04014: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13040 1726882412.04035: getting variables
13040 1726882412.04038: in VariableManager get_vars()
13040 1726882412.04100: Calling all_inventory to load vars for managed_node1
13040 1726882412.04103: Calling groups_inventory to load vars for managed_node1
13040 1726882412.04106: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882412.04117: Calling all_plugins_play to load vars for managed_node1
13040 1726882412.04120: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882412.04124: Calling groups_plugins_play to load vars for managed_node1
13040 1726882412.04378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882412.04598: done with get_vars()
13040 1726882412.04609: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:33:32 -0400 (0:00:00.064) 0:00:09.524 ******
13040 1726882412.04717: entering _queue_task() for managed_node1/ping
13040 1726882412.04737: WORKER PROCESS EXITING
13040 1726882412.05250: worker is 1 (out of 1 available)
13040 1726882412.05267: exiting _queue_task() for managed_node1/ping
13040 1726882412.05279: done queuing things up, now waiting for results queue to drain
13040 1726882412.05280: waiting for pending results...
13040 1726882412.05566: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
13040 1726882412.05717: in run() - task 0e448fcc-3ce9-b123-314b-000000000181
13040 1726882412.05738: variable 'ansible_search_path' from source: unknown
13040 1726882412.05745: variable 'ansible_search_path' from source: unknown
13040 1726882412.05789: calling self._execute()
13040 1726882412.05891: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882412.05901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882412.05914: variable 'omit' from source: magic vars
13040 1726882412.06356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882412.08831: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882412.08917: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882412.08960: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882412.09000: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882412.09026: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882412.09104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882412.09134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882412.09165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882412.09220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882412.09241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882412.09390: variable 'ansible_distribution' from source: facts
13040 1726882412.09401: variable 'ansible_distribution_major_version' from source: facts
13040 1726882412.09427: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882412.09436: when evaluation is False, skipping this task
13040 1726882412.09444: _execute() done
13040 1726882412.09450: dumping result to json
13040 1726882412.09462: done dumping result, returning
13040 1726882412.09478: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-b123-314b-000000000181]
13040 1726882412.09488: sending task result for task 0e448fcc-3ce9-b123-314b-000000000181
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882412.09635: no more pending results, returning what we have
13040 1726882412.09639: results queue empty
13040 1726882412.09641: checking for any_errors_fatal
13040 1726882412.09646: done checking for any_errors_fatal
13040 1726882412.09647: checking for max_fail_percentage
13040 1726882412.09649: done checking for max_fail_percentage
13040 1726882412.09650: checking to see if all hosts have failed and the running result is not ok
13040 1726882412.09651: done checking to see if all hosts have failed
13040 1726882412.09655: getting the remaining hosts for this loop
13040 1726882412.09656: done getting the remaining hosts for this loop
13040 1726882412.09660: getting the next task for host managed_node1
13040 1726882412.09673: done getting next task for host managed_node1
13040 1726882412.09676: ^ task is: TASK: meta (role_complete)
13040 1726882412.09680: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13040 1726882412.09703: getting variables
13040 1726882412.09706: in VariableManager get_vars()
13040 1726882412.09766: Calling all_inventory to load vars for managed_node1
13040 1726882412.09769: Calling groups_inventory to load vars for managed_node1
13040 1726882412.09772: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882412.09784: Calling all_plugins_play to load vars for managed_node1
13040 1726882412.09787: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882412.09791: Calling groups_plugins_play to load vars for managed_node1
13040 1726882412.09982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882412.10212: done with get_vars()
13040 1726882412.10223: done getting variables
13040 1726882412.10311: done queuing things up, now waiting for results queue to drain
13040 1726882412.10313: results queue empty
13040 1726882412.10314: checking for any_errors_fatal
13040 1726882412.10315: done checking for any_errors_fatal
13040 1726882412.10316: checking for max_fail_percentage
13040 1726882412.10317: done checking for max_fail_percentage
13040 1726882412.10318: checking to see if all hosts have failed and the running result is not ok
13040 1726882412.10318: done checking to see if all hosts have failed
13040 1726882412.10319: getting the remaining hosts for this loop
13040 1726882412.10320: done getting the remaining hosts for this loop
13040 1726882412.10322: getting the next task for host managed_node1
13040 1726882412.10326: done getting next task for host managed_node1
13040 1726882412.10328: ^ task is: TASK: Delete the device '{{ controller_device }}'
13040 1726882412.10330: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13040 1726882412.10332: getting variables
13040 1726882412.10333: in VariableManager get_vars()
13040 1726882412.10355: Calling all_inventory to load vars for managed_node1
13040 1726882412.10357: Calling groups_inventory to load vars for managed_node1
13040 1726882412.10778: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882412.10786: done sending task result for task 0e448fcc-3ce9-b123-314b-000000000181
13040 1726882412.10789: WORKER PROCESS EXITING
13040 1726882412.10794: Calling all_plugins_play to load vars for managed_node1
13040 1726882412.10796: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882412.10799: Calling groups_plugins_play to load vars for managed_node1
13040 1726882412.11021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882412.11455: done with get_vars()
13040 1726882412.11468: done getting variables
13040 1726882412.11510: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
13040 1726882412.11641: variable 'controller_device' from source: play vars

TASK [Delete the device 'nm-bond'] *********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:242
Friday 20 September 2024 21:33:32 -0400 (0:00:00.069) 0:00:09.593 ******
13040 1726882412.11679: entering _queue_task() for managed_node1/command
13040 1726882412.11979: worker is 1 (out of 1 available)
13040 1726882412.11992: exiting _queue_task() for managed_node1/command
13040 1726882412.12004: done queuing things up, now waiting for results queue to drain
13040 1726882412.12005: waiting for pending results...
13040 1726882412.12307: running TaskExecutor() for managed_node1/TASK: Delete the device 'nm-bond'
13040 1726882412.12432: in run() - task 0e448fcc-3ce9-b123-314b-0000000001b1
13040 1726882412.12454: variable 'ansible_search_path' from source: unknown
13040 1726882412.12494: calling self._execute()
13040 1726882412.12589: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882412.12598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882412.12609: variable 'omit' from source: magic vars
13040 1726882412.13086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882412.17215: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882412.17307: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882412.17355: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882412.17396: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882412.17426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882412.17582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882412.17616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882412.17645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882412.17731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882412.17785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882412.17958: variable 'ansible_distribution' from source: facts
13040 1726882412.17973: variable 'ansible_distribution_major_version' from source: facts
13040 1726882412.18017: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882412.18025: when evaluation is False, skipping this task
13040 1726882412.18032: _execute() done
13040 1726882412.18038: dumping result to json
13040 1726882412.18045: done dumping result, returning
13040 1726882412.18059: done running TaskExecutor() for managed_node1/TASK: Delete the device 'nm-bond' [0e448fcc-3ce9-b123-314b-0000000001b1]
13040 1726882412.18077: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001b1
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882412.18271: no more pending results, returning what we have
13040 1726882412.18275: results queue empty
13040 1726882412.18276: checking for any_errors_fatal
13040 1726882412.18278: done checking for any_errors_fatal
13040 1726882412.18279: checking for max_fail_percentage
13040 1726882412.18281: done checking for max_fail_percentage
13040 1726882412.18282: checking to see if all hosts have failed and the running result is not ok
13040 1726882412.18283: done checking to see if all hosts have failed
13040 1726882412.18287: getting the remaining hosts for this loop
13040 1726882412.18289: done getting the remaining hosts for this loop
13040 1726882412.18293: getting the next task for host managed_node1
13040 1726882412.18302: done getting next task for host managed_node1
13040 1726882412.18304: ^ task is: TASK: Remove test interfaces
13040 1726882412.18309: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13040 1726882412.18314: getting variables
13040 1726882412.18316: in VariableManager get_vars()
13040 1726882412.18377: Calling all_inventory to load vars for managed_node1
13040 1726882412.18380: Calling groups_inventory to load vars for managed_node1
13040 1726882412.18383: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882412.18395: Calling all_plugins_play to load vars for managed_node1
13040 1726882412.18398: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882412.18401: Calling groups_plugins_play to load vars for managed_node1
13040 1726882412.18589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882412.18809: done with get_vars()
13040 1726882412.18821: done getting variables
13040 1726882412.19166: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001b1
13040 1726882412.19170: WORKER PROCESS EXITING
13040 1726882412.19188: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Remove test interfaces] **************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
Friday 20 September 2024 21:33:32 -0400 (0:00:00.075) 0:00:09.669 ******
13040 1726882412.19220: entering _queue_task() for managed_node1/shell
13040 1726882412.19473: worker is 1 (out of 1 available)
13040 1726882412.19486: exiting _queue_task() for managed_node1/shell
13040 1726882412.19498: done queuing things up, now waiting for results queue to drain
13040 1726882412.19499: waiting for pending results...
13040 1726882412.19784: running TaskExecutor() for managed_node1/TASK: Remove test interfaces
13040 1726882412.20137: in run() - task 0e448fcc-3ce9-b123-314b-0000000001b5
13040 1726882412.20156: variable 'ansible_search_path' from source: unknown
13040 1726882412.20163: variable 'ansible_search_path' from source: unknown
13040 1726882412.20204: calling self._execute()
13040 1726882412.20298: variable 'ansible_host' from source: host vars for 'managed_node1'
13040 1726882412.20309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
13040 1726882412.20326: variable 'omit' from source: magic vars
13040 1726882412.20833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13040 1726882412.23592: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13040 1726882412.23674: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13040 1726882412.23723: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13040 1726882412.23767: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13040 1726882412.23801: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13040 1726882412.23941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13040 1726882412.23999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13040 1726882412.24026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13040 1726882412.24757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13040 1726882412.24840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13040 1726882412.25098: variable 'ansible_distribution' from source: facts
13040 1726882412.25161: variable 'ansible_distribution_major_version' from source: facts
13040 1726882412.25188: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False
13040 1726882412.25266: when evaluation is False, skipping this task
13040 1726882412.25273: _execute() done
13040 1726882412.25279: dumping result to json
13040 1726882412.25286: done dumping result, returning
13040 1726882412.25296: done running TaskExecutor() for managed_node1/TASK: Remove test interfaces [0e448fcc-3ce9-b123-314b-0000000001b5]
13040 1726882412.25305: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001b5
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)",
    "skip_reason": "Conditional result was False"
}
13040 1726882412.25450: no more pending results, returning what we have
13040 1726882412.25457: results queue empty
13040 1726882412.25458: checking for any_errors_fatal
13040 1726882412.25470: done checking for any_errors_fatal
13040 1726882412.25471: checking for max_fail_percentage
13040 1726882412.25473: done checking for max_fail_percentage
13040 1726882412.25473: checking to see if all hosts have failed and the running result is not ok
13040 1726882412.25474: done checking to see if all hosts have failed
13040 1726882412.25475: getting the remaining hosts for this loop
13040 1726882412.25477: done getting the remaining hosts for this loop
13040 1726882412.25481: getting the next task for host managed_node1
13040 1726882412.25489: done getting next task for host managed_node1
13040 1726882412.25492: ^ task is: TASK: Stop dnsmasq/radvd services
13040 1726882412.25497: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13040 1726882412.25501: getting variables
13040 1726882412.25503: in VariableManager get_vars()
13040 1726882412.25566: Calling all_inventory to load vars for managed_node1
13040 1726882412.25569: Calling groups_inventory to load vars for managed_node1
13040 1726882412.25572: Calling all_plugins_inventory to load vars for managed_node1
13040 1726882412.25584: Calling all_plugins_play to load vars for managed_node1
13040 1726882412.25587: Calling groups_plugins_inventory to load vars for managed_node1
13040 1726882412.25590: Calling groups_plugins_play to load vars for managed_node1
13040 1726882412.25787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13040 1726882412.26083: done with get_vars()
13040 1726882412.26093: done getting variables
13040 1726882412.26158: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
13040 1726882412.27020: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001b5
13040 1726882412.27024: WORKER PROCESS EXITING

TASK [Stop dnsmasq/radvd services] *********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23
Friday 20 September 2024 21:33:32 -0400 (0:00:00.078) 0:00:09.747 ******
13040 1726882412.27039: entering _queue_task() for managed_node1/shell
13040 1726882412.27306: worker is 1 (out of 1 available)
13040 1726882412.27320: exiting _queue_task() for managed_node1/shell
13040 1726882412.27332: done queuing things up, now waiting for results queue to drain
13040 1726882412.27333: waiting for pending results...
13040 1726882412.28239: running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services 13040 1726882412.28604: in run() - task 0e448fcc-3ce9-b123-314b-0000000001b6 13040 1726882412.28625: variable 'ansible_search_path' from source: unknown 13040 1726882412.28633: variable 'ansible_search_path' from source: unknown 13040 1726882412.28794: calling self._execute() 13040 1726882412.28894: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882412.29018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882412.29034: variable 'omit' from source: magic vars 13040 1726882412.29844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882412.34508: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882412.34608: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882412.34649: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882412.34694: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882412.34725: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882412.34812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882412.34845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882412.34884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882412.34929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882412.34947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882412.35102: variable 'ansible_distribution' from source: facts 13040 1726882412.35113: variable 'ansible_distribution_major_version' from source: facts 13040 1726882412.35135: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882412.35142: when evaluation is False, skipping this task 13040 1726882412.35149: _execute() done 13040 1726882412.35158: dumping result to json 13040 1726882412.35168: done dumping result, returning 13040 1726882412.35179: done running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services [0e448fcc-3ce9-b123-314b-0000000001b6] 13040 1726882412.35188: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001b6 13040 1726882412.35301: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001b6 13040 1726882412.35310: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882412.35360: no more pending results, returning what we have 13040 1726882412.35366: results queue empty 13040 1726882412.35367: checking for any_errors_fatal 13040 1726882412.35375: done checking for any_errors_fatal 13040 1726882412.35376: checking for max_fail_percentage 13040 1726882412.35378: done checking 
for max_fail_percentage 13040 1726882412.35379: checking to see if all hosts have failed and the running result is not ok 13040 1726882412.35380: done checking to see if all hosts have failed 13040 1726882412.35380: getting the remaining hosts for this loop 13040 1726882412.35382: done getting the remaining hosts for this loop 13040 1726882412.35385: getting the next task for host managed_node1 13040 1726882412.35393: done getting next task for host managed_node1 13040 1726882412.35396: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 13040 1726882412.35399: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882412.35403: getting variables 13040 1726882412.35405: in VariableManager get_vars() 13040 1726882412.35465: Calling all_inventory to load vars for managed_node1 13040 1726882412.35469: Calling groups_inventory to load vars for managed_node1 13040 1726882412.35472: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882412.35485: Calling all_plugins_play to load vars for managed_node1 13040 1726882412.35487: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882412.35491: Calling groups_plugins_play to load vars for managed_node1 13040 1726882412.35683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882412.35939: done with get_vars() 13040 1726882412.35954: done getting variables 13040 1726882412.36017: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:248 Friday 20 September 2024 21:33:32 -0400 (0:00:00.090) 0:00:09.837 ****** 13040 1726882412.36055: entering _queue_task() for managed_node1/command 13040 1726882412.36796: worker is 1 (out of 1 available) 13040 1726882412.36809: exiting _queue_task() for managed_node1/command 13040 1726882412.36821: done queuing things up, now waiting for results queue to drain 13040 1726882412.36823: waiting for pending results... 
13040 1726882412.37513: running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript 13040 1726882412.37736: in run() - task 0e448fcc-3ce9-b123-314b-0000000001b7 13040 1726882412.37808: variable 'ansible_search_path' from source: unknown 13040 1726882412.37862: calling self._execute() 13040 1726882412.37994: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882412.38005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882412.38019: variable 'omit' from source: magic vars 13040 1726882412.38498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882412.41522: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882412.41639: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882412.41742: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882412.41801: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882412.41836: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882412.41923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882412.41961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882412.41996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882412.42045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882412.42071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882412.42219: variable 'ansible_distribution' from source: facts 13040 1726882412.42230: variable 'ansible_distribution_major_version' from source: facts 13040 1726882412.42259: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882412.42271: when evaluation is False, skipping this task 13040 1726882412.42279: _execute() done 13040 1726882412.42284: dumping result to json 13040 1726882412.42290: done dumping result, returning 13040 1726882412.42301: done running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript [0e448fcc-3ce9-b123-314b-0000000001b7] 13040 1726882412.42311: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001b7 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882412.42460: no more pending results, returning what we have 13040 1726882412.42466: results queue empty 13040 1726882412.42467: checking for any_errors_fatal 13040 1726882412.42474: done checking for any_errors_fatal 13040 1726882412.42475: checking for max_fail_percentage 13040 1726882412.42478: done checking for max_fail_percentage 13040 1726882412.42479: checking to see if all hosts have failed and the running result is not ok 
13040 1726882412.42480: done checking to see if all hosts have failed 13040 1726882412.42480: getting the remaining hosts for this loop 13040 1726882412.42482: done getting the remaining hosts for this loop 13040 1726882412.42486: getting the next task for host managed_node1 13040 1726882412.42494: done getting next task for host managed_node1 13040 1726882412.42497: ^ task is: TASK: Verify network state restored to default 13040 1726882412.42500: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13040 1726882412.42504: getting variables 13040 1726882412.42506: in VariableManager get_vars() 13040 1726882412.42571: Calling all_inventory to load vars for managed_node1 13040 1726882412.42576: Calling groups_inventory to load vars for managed_node1 13040 1726882412.42579: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882412.42590: Calling all_plugins_play to load vars for managed_node1 13040 1726882412.42593: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882412.42596: Calling groups_plugins_play to load vars for managed_node1 13040 1726882412.42837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882412.43042: done with get_vars() 13040 1726882412.43055: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:253 Friday 20 September 2024 21:33:32 -0400 (0:00:00.070) 0:00:09.908 ****** 13040 1726882412.43157: entering _queue_task() for managed_node1/include_tasks 13040 1726882412.43178: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001b7 13040 1726882412.43186: WORKER PROCESS EXITING 13040 1726882412.43848: worker is 1 (out of 1 available) 13040 1726882412.43865: exiting _queue_task() for managed_node1/include_tasks 13040 1726882412.43877: done queuing things up, now waiting for results queue to drain 13040 1726882412.43878: waiting for pending results... 
13040 1726882412.44168: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 13040 1726882412.44281: in run() - task 0e448fcc-3ce9-b123-314b-0000000001b8 13040 1726882412.44301: variable 'ansible_search_path' from source: unknown 13040 1726882412.44348: calling self._execute() 13040 1726882412.44466: variable 'ansible_host' from source: host vars for 'managed_node1' 13040 1726882412.44478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 13040 1726882412.44495: variable 'omit' from source: magic vars 13040 1726882412.44960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13040 1726882412.48842: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13040 1726882412.48933: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13040 1726882412.48977: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13040 1726882412.49023: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13040 1726882412.49054: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13040 1726882412.49142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13040 1726882412.49180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13040 1726882412.49240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13040 1726882412.49303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13040 1726882412.49331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13040 1726882412.49480: variable 'ansible_distribution' from source: facts 13040 1726882412.49491: variable 'ansible_distribution_major_version' from source: facts 13040 1726882412.49513: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 13040 1726882412.49522: when evaluation is False, skipping this task 13040 1726882412.49534: _execute() done 13040 1726882412.49540: dumping result to json 13040 1726882412.49547: done dumping result, returning 13040 1726882412.49557: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0e448fcc-3ce9-b123-314b-0000000001b8] 13040 1726882412.49571: sending task result for task 0e448fcc-3ce9-b123-314b-0000000001b8 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 13040 1726882412.49807: no more pending results, returning what we have 13040 1726882412.49812: results queue empty 13040 1726882412.49813: checking for any_errors_fatal 13040 1726882412.49818: done checking for any_errors_fatal 13040 1726882412.49819: checking for max_fail_percentage 13040 1726882412.49820: done checking for max_fail_percentage 13040 1726882412.49821: checking to see if all hosts have failed and the running result is not ok 13040 
1726882412.49822: done checking to see if all hosts have failed 13040 1726882412.49823: getting the remaining hosts for this loop 13040 1726882412.49824: done getting the remaining hosts for this loop 13040 1726882412.49829: getting the next task for host managed_node1 13040 1726882412.49837: done getting next task for host managed_node1 13040 1726882412.49839: ^ task is: TASK: meta (flush_handlers) 13040 1726882412.49841: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882412.49845: getting variables 13040 1726882412.49848: in VariableManager get_vars() 13040 1726882412.49910: Calling all_inventory to load vars for managed_node1 13040 1726882412.49914: Calling groups_inventory to load vars for managed_node1 13040 1726882412.49917: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882412.49928: Calling all_plugins_play to load vars for managed_node1 13040 1726882412.49931: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882412.49935: Calling groups_plugins_play to load vars for managed_node1 13040 1726882412.50134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882412.50400: done with get_vars() 13040 1726882412.50419: done getting variables 13040 1726882412.50536: done sending task result for task 0e448fcc-3ce9-b123-314b-0000000001b8 13040 1726882412.50540: WORKER PROCESS EXITING 13040 1726882412.50592: in VariableManager get_vars() 13040 1726882412.50613: Calling all_inventory to load vars for managed_node1 13040 1726882412.50615: Calling groups_inventory to load vars for managed_node1 13040 1726882412.50618: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882412.50738: 
Calling all_plugins_play to load vars for managed_node1 13040 1726882412.50743: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882412.50747: Calling groups_plugins_play to load vars for managed_node1 13040 1726882412.51006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882412.51273: done with get_vars() 13040 1726882412.51295: done queuing things up, now waiting for results queue to drain 13040 1726882412.51298: results queue empty 13040 1726882412.51298: checking for any_errors_fatal 13040 1726882412.51301: done checking for any_errors_fatal 13040 1726882412.51302: checking for max_fail_percentage 13040 1726882412.51303: done checking for max_fail_percentage 13040 1726882412.51304: checking to see if all hosts have failed and the running result is not ok 13040 1726882412.51305: done checking to see if all hosts have failed 13040 1726882412.51305: getting the remaining hosts for this loop 13040 1726882412.51306: done getting the remaining hosts for this loop 13040 1726882412.51309: getting the next task for host managed_node1 13040 1726882412.51313: done getting next task for host managed_node1 13040 1726882412.51314: ^ task is: TASK: meta (flush_handlers) 13040 1726882412.51316: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13040 1726882412.51319: getting variables 13040 1726882412.51320: in VariableManager get_vars() 13040 1726882412.51340: Calling all_inventory to load vars for managed_node1 13040 1726882412.51342: Calling groups_inventory to load vars for managed_node1 13040 1726882412.51344: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882412.51354: Calling all_plugins_play to load vars for managed_node1 13040 1726882412.51357: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882412.51360: Calling groups_plugins_play to load vars for managed_node1 13040 1726882412.51519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882412.51695: done with get_vars() 13040 1726882412.51703: done getting variables 13040 1726882412.51752: in VariableManager get_vars() 13040 1726882412.51773: Calling all_inventory to load vars for managed_node1 13040 1726882412.51776: Calling groups_inventory to load vars for managed_node1 13040 1726882412.51778: Calling all_plugins_inventory to load vars for managed_node1 13040 1726882412.51782: Calling all_plugins_play to load vars for managed_node1 13040 1726882412.51784: Calling groups_plugins_inventory to load vars for managed_node1 13040 1726882412.51786: Calling groups_plugins_play to load vars for managed_node1 13040 1726882412.51929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13040 1726882412.52191: done with get_vars() 13040 1726882412.52203: done queuing things up, now waiting for results queue to drain 13040 1726882412.52205: results queue empty 13040 1726882412.52206: checking for any_errors_fatal 13040 1726882412.52207: done checking for any_errors_fatal 13040 1726882412.52208: checking for max_fail_percentage 13040 1726882412.52209: done checking for max_fail_percentage 13040 1726882412.52210: checking to see if all hosts have failed and the running result is not 
ok 13040 1726882412.52210: done checking to see if all hosts have failed 13040 1726882412.52211: getting the remaining hosts for this loop 13040 1726882412.52212: done getting the remaining hosts for this loop 13040 1726882412.52215: getting the next task for host managed_node1 13040 1726882412.52218: done getting next task for host managed_node1 13040 1726882412.52219: ^ task is: None 13040 1726882412.52220: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13040 1726882412.52221: done queuing things up, now waiting for results queue to drain 13040 1726882412.52222: results queue empty 13040 1726882412.52223: checking for any_errors_fatal 13040 1726882412.52223: done checking for any_errors_fatal 13040 1726882412.52224: checking for max_fail_percentage 13040 1726882412.52225: done checking for max_fail_percentage 13040 1726882412.52226: checking to see if all hosts have failed and the running result is not ok 13040 1726882412.52226: done checking to see if all hosts have failed 13040 1726882412.52228: getting the next task for host managed_node1 13040 1726882412.52231: done getting next task for host managed_node1 13040 1726882412.52231: ^ task is: None 13040 1726882412.52232: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed_node1 : ok=7 changed=0 unreachable=0 failed=0 skipped=151 rescued=0 ignored=0 Friday 20 September 2024 21:33:32 -0400 (0:00:00.091) 0:00:10.000 ****** =============================================================================== Gathering Facts --------------------------------------------------------- 1.20s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml:5 Gather the minimum subset of ansible_facts required by the network role test --- 0.55s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Check if system is ostree ----------------------------------------------- 0.47s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 fedora.linux_system_roles.network : Enable network service -------------- 0.09s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Verify network state restored to default -------------------------------- 0.09s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:253 Stop dnsmasq/radvd services --------------------------------------------- 0.09s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 fedora.linux_system_roles.network : Print network provider -------------- 0.09s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable --- 0.09s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Remove test interfaces -------------------------------------------------- 0.08s 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 fedora.linux_system_roles.network : Enable and start wpa_supplicant ----- 0.08s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider --- 0.08s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable --- 0.08s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Delete the device 'nm-bond' --------------------------------------------- 0.08s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:242 fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces --- 0.07s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 fedora.linux_system_roles.network : Enable and start wpa_supplicant ----- 0.07s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 fedora.linux_system_roles.network : Ensure initscripts network file dependency is present --- 0.07s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable --- 0.07s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 fedora.linux_system_roles.network : Configure networking connection profiles --- 0.07s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 
fedora.linux_system_roles.network : Configure networking state ---------- 0.07s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Restore the /etc/resolv.conf for initscript ----------------------------- 0.07s /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:248 13040 1726882412.52350: RUNNING CLEANUP