[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
33192 1726883086.17575: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-4FB
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
33192 1726883086.17936: Added group all to inventory
33192 1726883086.17938: Added group ungrouped to inventory
33192 1726883086.17942: Group all now contains ungrouped
33192 1726883086.17945: Examining possible inventory source: /tmp/network-lQx/inventory.yml
33192 1726883086.30027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
33192 1726883086.30081: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
33192 1726883086.30101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
33192 1726883086.30150: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
33192 1726883086.30213: Loaded config def from plugin (inventory/script)
33192 1726883086.30215: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
33192 1726883086.30249: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
33192 1726883086.30322: Loaded config def from plugin
(inventory/yaml) 33192 1726883086.30324: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 33192 1726883086.30396: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 33192 1726883086.30750: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 33192 1726883086.30753: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 33192 1726883086.30755: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 33192 1726883086.30760: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 33192 1726883086.30763: Loading data from /tmp/network-lQx/inventory.yml 33192 1726883086.30819: /tmp/network-lQx/inventory.yml was not parsable by auto 33192 1726883086.30875: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 33192 1726883086.30906: Loading data from /tmp/network-lQx/inventory.yml 33192 1726883086.30975: group all already in inventory 33192 1726883086.30981: set inventory_file for managed_node1 33192 1726883086.30984: set inventory_dir for managed_node1 33192 1726883086.30985: Added host managed_node1 to inventory 33192 1726883086.30987: Added host managed_node1 to group all 33192 1726883086.30988: set ansible_host for managed_node1 33192 1726883086.30988: set ansible_ssh_extra_args for managed_node1 33192 1726883086.30991: set inventory_file for managed_node2 33192 1726883086.30993: set inventory_dir for managed_node2 33192 1726883086.30993: Added host managed_node2 to inventory 33192 1726883086.30994: Added host managed_node2 to group all 33192 1726883086.30995: set ansible_host for managed_node2 33192 1726883086.30996: set ansible_ssh_extra_args for managed_node2 33192 
1726883086.30998: set inventory_file for managed_node3 33192 1726883086.30999: set inventory_dir for managed_node3 33192 1726883086.31000: Added host managed_node3 to inventory 33192 1726883086.31001: Added host managed_node3 to group all 33192 1726883086.31001: set ansible_host for managed_node3 33192 1726883086.31002: set ansible_ssh_extra_args for managed_node3 33192 1726883086.31004: Reconcile groups and hosts in inventory. 33192 1726883086.31007: Group ungrouped now contains managed_node1 33192 1726883086.31008: Group ungrouped now contains managed_node2 33192 1726883086.31010: Group ungrouped now contains managed_node3 33192 1726883086.31076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 33192 1726883086.31181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 33192 1726883086.31220: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 33192 1726883086.31244: Loaded config def from plugin (vars/host_group_vars) 33192 1726883086.31246: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 33192 1726883086.31251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 33192 1726883086.31259: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 33192 1726883086.31298: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 33192 1726883086.31557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883086.31637: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 33192 1726883086.31668: Loaded config def from plugin (connection/local) 33192 1726883086.31670: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 33192 1726883086.32186: Loaded config def from plugin (connection/paramiko_ssh) 33192 1726883086.32189: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 33192 1726883086.32909: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 33192 1726883086.32944: Loaded config def from plugin (connection/psrp) 33192 1726883086.32947: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 33192 1726883086.33532: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 33192 1726883086.33567: Loaded config def from plugin (connection/ssh) 33192 1726883086.33569: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 33192 1726883086.35119: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 33192 1726883086.35152: Loaded config def from plugin (connection/winrm) 33192 1726883086.35154: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 33192 1726883086.35180: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 33192 1726883086.35235: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 33192 1726883086.35291: Loaded config def from plugin (shell/cmd) 33192 1726883086.35293: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 33192 1726883086.35315: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 33192 1726883086.35372: Loaded config def from plugin (shell/powershell) 33192 1726883086.35374: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 33192 1726883086.35416: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 33192 1726883086.35613: Loaded config def from plugin (shell/sh) 33192 1726883086.35615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 33192 1726883086.35644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 33192 1726883086.35748: Loaded config def from plugin (become/runas) 33192 1726883086.35750: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 33192 1726883086.35904: Loaded config def from plugin (become/su) 33192 1726883086.35906: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 33192 1726883086.36039: Loaded config def from plugin (become/sudo) 33192 
1726883086.36041: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 33192 1726883086.36068: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml 33192 1726883086.36331: in VariableManager get_vars() 33192 1726883086.36348: done with get_vars() 33192 1726883086.36454: trying /usr/local/lib/python3.12/site-packages/ansible/modules 33192 1726883086.39362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 33192 1726883086.39454: in VariableManager get_vars() 33192 1726883086.39457: done with get_vars() 33192 1726883086.39460: variable 'playbook_dir' from source: magic vars 33192 1726883086.39460: variable 'ansible_playbook_python' from source: magic vars 33192 1726883086.39461: variable 'ansible_config_file' from source: magic vars 33192 1726883086.39461: variable 'groups' from source: magic vars 33192 1726883086.39462: variable 'omit' from source: magic vars 33192 1726883086.39463: variable 'ansible_version' from source: magic vars 33192 1726883086.39463: variable 'ansible_check_mode' from source: magic vars 33192 1726883086.39464: variable 'ansible_diff_mode' from source: magic vars 33192 1726883086.39464: variable 'ansible_forks' from source: magic vars 33192 1726883086.39465: variable 'ansible_inventory_sources' from source: magic vars 33192 1726883086.39466: variable 'ansible_skip_tags' from source: magic vars 33192 1726883086.39466: variable 'ansible_limit' from source: magic vars 33192 1726883086.39467: variable 'ansible_run_tags' from source: magic vars 33192 1726883086.39467: variable 'ansible_verbosity' from source: magic vars 33192 1726883086.39497: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml 33192 1726883086.39966: in VariableManager 
get_vars() 33192 1726883086.39981: done with get_vars() 33192 1726883086.40087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 33192 1726883086.40250: in VariableManager get_vars() 33192 1726883086.40262: done with get_vars() 33192 1726883086.40266: variable 'omit' from source: magic vars 33192 1726883086.40281: variable 'omit' from source: magic vars 33192 1726883086.40309: in VariableManager get_vars() 33192 1726883086.40318: done with get_vars() 33192 1726883086.40357: in VariableManager get_vars() 33192 1726883086.40367: done with get_vars() 33192 1726883086.40397: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 33192 1726883086.40575: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 33192 1726883086.40683: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 33192 1726883086.41414: in VariableManager get_vars() 33192 1726883086.41438: done with get_vars() 33192 1726883086.41932: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 33192 1726883086.42124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 33192 1726883086.44439: in VariableManager get_vars() 33192 1726883086.44461: done with get_vars() 33192 1726883086.44467: variable 'omit' from source: magic vars 33192 1726883086.44483: variable 'omit' from source: magic vars 33192 1726883086.44525: in VariableManager get_vars() 33192 1726883086.44557: done with get_vars() 33192 1726883086.44591: in VariableManager get_vars() 33192 1726883086.44611: done with get_vars() 33192 1726883086.44646: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 33192 1726883086.44803: Loading data from 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 33192 1726883086.44911: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 33192 1726883086.46990: in VariableManager get_vars() 33192 1726883086.47022: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 33192 1726883086.49583: in VariableManager get_vars() 33192 1726883086.49607: done with get_vars() 33192 1726883086.49612: variable 'omit' from source: magic vars 33192 1726883086.49625: variable 'omit' from source: magic vars 33192 1726883086.49666: in VariableManager get_vars() 33192 1726883086.49689: done with get_vars() 33192 1726883086.49715: in VariableManager get_vars() 33192 1726883086.49739: done with get_vars() 33192 1726883086.49773: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 33192 1726883086.49943: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 33192 1726883086.50048: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 33192 1726883086.50549: in VariableManager get_vars() 33192 1726883086.50583: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 33192 1726883086.53195: in VariableManager get_vars() 33192 1726883086.53224: done with get_vars() 33192 1726883086.53230: variable 'omit' from source: magic vars 33192 1726883086.53266: variable 'omit' from source: magic vars 33192 1726883086.53320: in VariableManager get_vars() 33192 1726883086.53346: done with get_vars() 33192 1726883086.53374: in VariableManager get_vars() 33192 1726883086.53400: done with get_vars() 33192 1726883086.53435: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 33192 
1726883086.53612: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 33192 1726883086.53728: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 33192 1726883086.54300: in VariableManager get_vars() 33192 1726883086.54332: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 33192 1726883086.56801: in VariableManager get_vars() 33192 1726883086.56829: done with get_vars() 33192 1726883086.56882: in VariableManager get_vars() 33192 1726883086.56913: done with get_vars() 33192 1726883086.56988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 33192 1726883086.57010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 33192 1726883086.57314: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 33192 1726883086.57532: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 33192 1726883086.57537: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-4FB/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 33192 1726883086.57575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 33192 1726883086.57605: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 33192 1726883086.57839: Loading 
ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 33192 1726883086.57920: Loaded config def from plugin (callback/default) 33192 1726883086.57922: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 33192 1726883086.59520: Loaded config def from plugin (callback/junit) 33192 1726883086.59524: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 33192 1726883086.59588: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 33192 1726883086.59689: Loaded config def from plugin (callback/minimal) 33192 1726883086.59693: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 33192 1726883086.59745: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 33192 1726883086.59854: Loaded config def from plugin (callback/tree) 
33192 1726883086.59857: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
33192 1726883086.60118: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
33192 1726883086.60121: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-4FB/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_wireless_nm.yml ************************************************
2 plays in /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
33192 1726883086.60152: in VariableManager get_vars()
33192 1726883086.60165: done with get_vars()
33192 1726883086.60173: in VariableManager get_vars()
33192 1726883086.60183: done with get_vars()
33192 1726883086.60187: variable 'omit' from source: magic vars
33192 1726883086.60227: in VariableManager get_vars()
33192 1726883086.60353: done with get_vars()
33192 1726883086.60378: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_wireless.yml' with nm as provider] *********
33192 1726883086.61048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
33192 1726883086.61140: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
33192 1726883086.61177: getting the remaining hosts for this loop
33192 1726883086.61179: done getting the remaining hosts for this loop
33192 1726883086.61182: getting the next task for host managed_node1
33192 1726883086.61187: done getting next task for host managed_node1
33192 1726883086.61189: ^ task is: TASK: Gathering Facts
33192 1726883086.61191: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
33192 1726883086.61194: getting variables
33192 1726883086.61195: in VariableManager get_vars()
33192 1726883086.61207: Calling all_inventory to load vars for managed_node1
33192 1726883086.61210: Calling groups_inventory to load vars for managed_node1
33192 1726883086.61213: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883086.61227: Calling all_plugins_play to load vars for managed_node1
33192 1726883086.61246: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883086.61250: Calling groups_plugins_play to load vars for managed_node1
33192 1726883086.61300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883086.61368: done with get_vars()
33192 1726883086.61376: done getting variables
33192 1726883086.61477: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Friday 20 September 2024  21:44:46 -0400 (0:00:00.015)       0:00:00.015 ******
33192 1726883086.61508: entering _queue_task() for managed_node1/gather_facts
33192 1726883086.61510: Creating lock for gather_facts
33192 1726883086.61893: worker is 1 (out of 1 available)
33192 1726883086.61904: exiting _queue_task() for managed_node1/gather_facts
33192 1726883086.61919: done queuing things up, now waiting for results queue to drain
33192 1726883086.61921: waiting for pending results...
33192 1726883086.62178: running TaskExecutor() for managed_node1/TASK: Gathering Facts 33192 1726883086.62341: in run() - task 0affe814-3a2d-6c15-6a7e-000000000147 33192 1726883086.62345: variable 'ansible_search_path' from source: unknown 33192 1726883086.62357: calling self._execute() 33192 1726883086.62441: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883086.62457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883086.62475: variable 'omit' from source: magic vars 33192 1726883086.62709: variable 'omit' from source: magic vars 33192 1726883086.62713: variable 'omit' from source: magic vars 33192 1726883086.62715: variable 'omit' from source: magic vars 33192 1726883086.62747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33192 1726883086.62792: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33192 1726883086.62826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33192 1726883086.62855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33192 1726883086.62872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33192 1726883086.62910: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33192 1726883086.62928: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883086.62939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883086.63064: Set connection var ansible_shell_type to sh 33192 1726883086.63079: Set connection var ansible_connection to ssh 33192 1726883086.63098: Set connection var ansible_timeout to 10 33192 1726883086.63109: Set connection var ansible_module_compression to ZIP_DEFLATED 
33192 1726883086.63120: Set connection var ansible_pipelining to False 33192 1726883086.63131: Set connection var ansible_shell_executable to /bin/sh 33192 1726883086.63166: variable 'ansible_shell_executable' from source: unknown 33192 1726883086.63175: variable 'ansible_connection' from source: unknown 33192 1726883086.63183: variable 'ansible_module_compression' from source: unknown 33192 1726883086.63190: variable 'ansible_shell_type' from source: unknown 33192 1726883086.63197: variable 'ansible_shell_executable' from source: unknown 33192 1726883086.63204: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883086.63212: variable 'ansible_pipelining' from source: unknown 33192 1726883086.63253: variable 'ansible_timeout' from source: unknown 33192 1726883086.63256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883086.63445: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 33192 1726883086.63467: variable 'omit' from source: magic vars 33192 1726883086.63481: starting attempt loop 33192 1726883086.63488: running the handler 33192 1726883086.63539: variable 'ansible_facts' from source: unknown 33192 1726883086.63542: _low_level_execute_command(): starting 33192 1726883086.63545: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33192 1726883086.64322: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 33192 1726883086.64343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33192 1726883086.64360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883086.64467: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883086.64496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33192 1726883086.64513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883086.64533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883086.64644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883086.66405: stdout chunk (state=3): >>>/root <<< 33192 1726883086.66596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883086.66602: stderr chunk (state=3): >>><<< 33192 1726883086.66607: stdout chunk (state=3): >>><<< 33192 1726883086.66840: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33192 1726883086.66844: _low_level_execute_command(): starting 33192 1726883086.66847: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532 `" && echo ansible-tmp-1726883086.6664639-33203-249175268240532="` echo /root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532 `" ) && sleep 0' 33192 1726883086.67500: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 33192 1726883086.67509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33192 1726883086.67525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 33192 1726883086.67532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883086.67542: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33192 1726883086.67550: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 33192 1726883086.67565: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33192 1726883086.67573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 33192 1726883086.67582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883086.67665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33192 1726883086.67672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883086.67682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883086.67767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883086.69755: stdout chunk (state=3): >>>ansible-tmp-1726883086.6664639-33203-249175268240532=/root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532 <<< 33192 1726883086.69937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883086.69948: stdout chunk (state=3): >>><<< 33192 1726883086.69965: stderr chunk (state=3): >>><<< 33192 1726883086.69985: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883086.6664639-33203-249175268240532=/root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33192 1726883086.70021: variable 'ansible_module_compression' from source: unknown 33192 1726883086.70092: ANSIBALLZ: Using generic lock for ansible.legacy.setup 33192 1726883086.70100: ANSIBALLZ: Acquiring lock 33192 1726883086.70108: ANSIBALLZ: Lock acquired: 140092633062224 33192 1726883086.70116: ANSIBALLZ: Creating module 33192 1726883087.22231: ANSIBALLZ: Writing module into payload 33192 1726883087.22552: ANSIBALLZ: Writing module 33192 1726883087.22773: ANSIBALLZ: Renaming module 33192 1726883087.22777: ANSIBALLZ: Done creating module 33192 1726883087.22843: variable 'ansible_facts' from source: unknown 33192 1726883087.22856: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33192 1726883087.22864: _low_level_execute_command(): starting 33192 1726883087.22873: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 33192 1726883087.24149: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883087.24377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883087.24403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883087.24569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883087.26468: stdout chunk (state=3): >>>PLATFORM <<< 33192 1726883087.26512: stdout chunk (state=3): >>>Linux <<< 33192 1726883087.26549: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 33192 1726883087.26676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883087.26758: stderr chunk (state=3): >>><<< 33192 1726883087.26879: stdout chunk (state=3): >>><<< 33192 1726883087.26924: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33192 1726883087.26930 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 33192 1726883087.26963: _low_level_execute_command(): starting 33192 1726883087.26979: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 33192 1726883087.27340: Sending initial data 33192 1726883087.27344: Sent initial data (1181 bytes) 33192 1726883087.28311: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883087.28548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883087.28573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883087.28659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883087.32312: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 33192 1726883087.32741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883087.32783: stderr chunk (state=3): >>><<< 33192 1726883087.32786: stdout chunk (state=3): >>><<< 33192 1726883087.32804: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty 
Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33192 1726883087.32915: variable 'ansible_facts' from source: unknown 33192 1726883087.32918: variable 'ansible_facts' from source: unknown 33192 1726883087.32933: variable 'ansible_module_compression' from source: unknown 33192 1726883087.33453: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-33192zxvjc6ee/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 33192 1726883087.33539: variable 'ansible_facts' from source: unknown 33192 1726883087.33693: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532/AnsiballZ_setup.py 33192 1726883087.34559: Sending initial data 33192 1726883087.34569: Sent initial data (154 bytes) 33192 1726883087.35570: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 33192 1726883087.35576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883087.35592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33192 1726883087.35644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883087.35648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 33192 1726883087.35651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883087.35948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883087.36044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883087.36098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 33192 1726883087.37982: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 33192 1726883087.37997: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 33192 1726883087.38032: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532/AnsiballZ_setup.py" <<< 33192 1726883087.38038: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33192zxvjc6ee/tmpy0kaynb4 /root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532/AnsiballZ_setup.py <<< 33192 1726883087.38083: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-33192zxvjc6ee/tmpy0kaynb4" to remote "/root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532/AnsiballZ_setup.py" <<< 33192 1726883087.43047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883087.43062: stderr chunk (state=3): >>><<< 33192 1726883087.43119: stdout chunk (state=3): >>><<< 33192 1726883087.43157: done transferring module to remote 33192 
1726883087.43239: _low_level_execute_command(): starting 33192 1726883087.43259: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532/ /root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532/AnsiballZ_setup.py && sleep 0' 33192 1726883087.44426: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 33192 1726883087.44744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883087.44752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883087.44839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883087.46715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883087.46786: stderr chunk (state=3): >>><<< 33192 1726883087.46950: stdout chunk (state=3): >>><<< 33192 1726883087.47041: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33192 1726883087.47044: _low_level_execute_command(): starting 33192 1726883087.47053: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532/AnsiballZ_setup.py && sleep 0' 33192 1726883087.48140: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 33192 1726883087.48190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33192 1726883087.48193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883087.48197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883087.48199: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 33192 1726883087.48201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883087.48204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883087.48558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883087.48576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883087.50743: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 33192 1726883087.50775: stdout chunk (state=3): >>>import _imp # builtin <<< 33192 1726883087.50807: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 33192 1726883087.50814: stdout chunk (state=3): >>>import '_weakref' # <<< 33192 1726883087.50896: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 33192 1726883087.50930: stdout chunk (state=3): >>>import 'posix' # <<< 33192 1726883087.51025: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 33192 1726883087.51030: stdout chunk (state=3): >>># installing zipimport hook <<< 33192 1726883087.51033: stdout chunk (state=3): >>>import 'time' # <<< 33192 1726883087.51057: stdout chunk (state=3): >>>import 'zipimport' # <<< 33192 1726883087.51068: stdout chunk (state=3): >>># installed zipimport hook <<< 33192 1726883087.51252: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d367b4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36783b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d367b6ab0> <<< 33192 1726883087.51274: stdout chunk (state=3): >>>import '_signal' # <<< 33192 1726883087.51328: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 33192 1726883087.51333: stdout chunk (state=3): >>>import 'io' # <<< 33192 1726883087.51469: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 33192 1726883087.51496: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # <<< 33192 1726883087.51522: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 33192 1726883087.51533: stdout chunk (state=3): >>>Processing user site-packages <<< 33192 1726883087.51538: stdout chunk (state=3): >>>Processing global site-packages <<< 33192 1726883087.51551: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 33192 1726883087.51554: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 33192 1726883087.51557: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 33192 1726883087.51588: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 33192 1726883087.51603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 33192 1726883087.51611: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36589160> <<< 33192 1726883087.51686: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 33192 1726883087.51714: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883087.51717: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36589fd0> <<< 33192 1726883087.51906: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 33192 1726883087.52148: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 33192 1726883087.52281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 33192 1726883087.52291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 33192 1726883087.52313: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365c7dd0> <<< 33192 1726883087.52319: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 33192 1726883087.52344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 33192 1726883087.52406: stdout chunk (state=3): >>>import '_operator' # <<< 33192 1726883087.52411: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365c7fe0> <<< 33192 1726883087.52414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 33192 1726883087.52437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 33192 1726883087.52443: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/__init__.py <<< 33192 1726883087.52554: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365ff800> <<< 33192 1726883087.52574: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 33192 1726883087.52577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 33192 1726883087.52708: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365ffe90> <<< 33192 1726883087.52748: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365dfaa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365dd190> <<< 33192 1726883087.52793: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365c4f80> <<< 33192 1726883087.52831: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 33192 1726883087.52947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 33192 1726883087.52966: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36623710> <<< 33192 1726883087.52987: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36622330> <<< 33192 1726883087.53031: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365de060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36620a40> <<< 33192 1726883087.53176: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d366546b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365c4200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36654b60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365c6840> <<< 33192 1726883087.53181: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 33192 
1726883087.53184: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36654dd0> <<< 33192 1726883087.53186: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365c2d20> <<< 33192 1726883087.53253: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883087.53294: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 33192 1726883087.53298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 33192 1726883087.53485: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d366554c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36655190> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d366563c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 33192 1726883087.53511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 
'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d366705c0> import 'errno' # <<< 33192 1726883087.53515: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.53537: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36671d00> <<< 33192 1726883087.53590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 33192 1726883087.53597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 33192 1726883087.53632: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36672bd0> <<< 33192 1726883087.53638: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.53641: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36673230> <<< 33192 1726883087.53644: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36672120> <<< 33192 1726883087.53720: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 33192 1726883087.53724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 
33192 1726883087.53727: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36673c80> <<< 33192 1726883087.53774: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d366733b0> <<< 33192 1726883087.53778: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d366563f0> <<< 33192 1726883087.53954: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36363b30> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 33192 1726883087.54039: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3638c620> import 'bisect' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2d3638c380> <<< 33192 1726883087.54064: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3638c590> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3638c7d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36361cd0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 33192 1726883087.54183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 33192 1726883087.54336: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 33192 1726883087.54341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 33192 1726883087.54344: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3638de50> <<< 33192 1726883087.54369: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3638cb00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36656ae0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 33192 1726883087.54390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 33192 1726883087.54426: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d363be1b0> <<< 33192 1726883087.54483: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 33192 1726883087.54499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883087.54708: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d363d6360> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 33192 1726883087.54718: stdout chunk (state=3): >>>import 'ntpath' # <<< 33192 1726883087.54721: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 33192 1726883087.54736: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36413140> <<< 33192 1726883087.54743: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 33192 1726883087.54844: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 33192 1726883087.54852: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 33192 1726883087.54855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 33192 1726883087.54950: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d364398e0> <<< 33192 1726883087.55021: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36413260> <<< 33192 1726883087.55085: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d363d6ff0> <<< 33192 1726883087.55127: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36260230> <<< 33192 1726883087.55131: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d363d53a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3638ed50> <<< 33192 1726883087.55550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2d36260410> <<< 33192 1726883087.55559: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_xnyxpsds/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 33192 1726883087.55638: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.55664: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 33192 1726883087.55673: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 33192 1726883087.55717: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 33192 1726883087.55802: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 33192 1726883087.55853: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 33192 1726883087.55857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d362c5f70> <<< 33192 1726883087.55859: stdout chunk (state=3): >>>import '_typing' # <<< 33192 1726883087.56043: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3629ce60> <<< 33192 1726883087.56047: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36263fb0> # zipimport: zlib available <<< 33192 1726883087.56075: stdout chunk (state=3): >>>import 'ansible' # <<< 33192 1726883087.56078: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.56110: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.56247: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 33192 1726883087.57695: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.58978: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 33192 1726883087.58999: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3629fdd0> <<< 33192 1726883087.59007: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883087.59041: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 33192 1726883087.59050: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 33192 1726883087.59153: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 33192 1726883087.59159: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.59166: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d362f58b0> <<< 33192 1726883087.59201: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d362f5640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d362f4f80> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 33192 1726883087.59209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 33192 1726883087.59347: stdout chunk (state=3): >>>import 'json.encoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2d362f5a30> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d362c6990> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d362f65d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d362f6810> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 33192 1726883087.59479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 33192 1726883087.59509: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d362f6d50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 33192 1726883087.59547: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3615cb60> <<< 33192 1726883087.59576: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.59610: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3615e780> <<< 33192 
1726883087.59613: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 33192 1726883087.59616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 33192 1726883087.60058: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3615f110> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3615ff80> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 33192 1726883087.60062: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36162db0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36162ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36161070> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from 
'/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 33192 1726883087.60094: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36166bd0> import '_tokenize' # <<< 33192 1726883087.60168: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361656a0> <<< 33192 1726883087.60187: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36165400> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 33192 1726883087.60196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 33192 1726883087.60274: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36167b00> <<< 33192 1726883087.60344: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36161580> <<< 33192 1726883087.60352: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.60355: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361aad20> <<< 33192 1726883087.60367: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361aaf30> <<< 33192 1726883087.60385: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 33192 1726883087.60415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 33192 1726883087.60458: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 33192 1726883087.60477: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.60482: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361ac9e0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361ac7a0> <<< 33192 1726883087.60485: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 33192 1726883087.60757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361aeed0> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2d361ad0d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 33192 1726883087.60760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 33192 1726883087.60821: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361ba6f0> <<< 33192 1726883087.60976: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361af080> <<< 33192 1726883087.61048: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361bb440> <<< 33192 1726883087.61083: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361bb7d0> <<< 33192 1726883087.61145: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' 
import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361bba10> <<< 33192 1726883087.61218: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361ab0b0> <<< 33192 1726883087.61222: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 33192 1726883087.61225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 33192 1726883087.61227: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 33192 1726883087.61229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 33192 1726883087.61254: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.61350: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361beab0> <<< 33192 1726883087.61483: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.61486: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.61493: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361bfe60> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361bd250> <<< 33192 1726883087.61643: stdout chunk (state=3): >>># extension module 'systemd._daemon' 
loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.61650: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361be120> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361bcdd0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 33192 1726883087.61736: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.61854: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 33192 1726883087.61869: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.61874: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 33192 1726883087.61877: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.62073: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.62163: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.62852: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.63514: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 33192 1726883087.63661: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36048050> <<< 33192 1726883087.63964: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36048e90> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36166ba0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 33192 1726883087.64036: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.64216: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 33192 1726883087.64219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 33192 1726883087.64245: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36049040> # zipimport: zlib available <<< 33192 1726883087.64811: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.65447: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33192 1726883087.65532: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 33192 1726883087.65543: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.65587: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.65625: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' 
# <<< 33192 1726883087.65632: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.65721: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.65849: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 33192 1726883087.65853: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.65956: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 33192 1726883087.65978: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 33192 1726883087.65985: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.66266: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.66550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 33192 1726883087.66811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3604b2c0> # zipimport: zlib available <<< 33192 1726883087.66837: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.66902: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 33192 1726883087.66909: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 33192 1726883087.66943: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 33192 1726883087.66947: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 33192 1726883087.66949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 33192 1726883087.67029: stdout chunk (state=3): 
>>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.67157: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36051a90> <<< 33192 1726883087.67206: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.67214: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d360523c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3604a5d0> <<< 33192 1726883087.67269: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.67506: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 33192 1726883087.67509: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33192 1726883087.67575: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 33192 1726883087.67614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883087.67723: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 
'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36051160> <<< 33192 1726883087.67749: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360525d0> <<< 33192 1726883087.67780: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 33192 1726883087.67783: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 33192 1726883087.67803: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.67863: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.67951: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.68024: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.68076: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 33192 1726883087.68139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 33192 1726883087.68248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360e6690> <<< 33192 1726883087.68280: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2d3605c2c0> <<< 33192 1726883087.68365: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36056450> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360562a0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 33192 1726883087.68373: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.68406: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.68442: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 33192 1726883087.68532: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 33192 1726883087.68537: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.68939: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 33192 1726883087.68946: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.68977: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available <<< 33192 1726883087.69078: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.69081: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.69115: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 33192 1726883087.69121: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.69323: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.69517: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.69563: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.69627: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883087.69650: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 33192 1726883087.69668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 33192 1726883087.69678: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 33192 1726883087.69726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 33192 1726883087.69735: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360ecd70> <<< 33192 1726883087.69785: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 33192 1726883087.69788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 33192 1726883087.69790: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 33192 1726883087.69949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f2d35547dd0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d355483e0> <<< 33192 1726883087.70007: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36064e90> <<< 33192 1726883087.70010: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360642c0> <<< 33192 1726883087.70052: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360eec30> <<< 33192 1726883087.70057: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360ee780> <<< 33192 1726883087.70094: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 33192 1726883087.70161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 33192 1726883087.70174: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 33192 1726883087.70178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 33192 1726883087.70180: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 33192 1726883087.70183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 33192 1726883087.70251: stdout chunk (state=3): >>># extension module '_heapq' loaded from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3554b1d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3554aa80> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.70268: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3554ac60> <<< 33192 1726883087.70298: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d35549eb0> <<< 33192 1726883087.70448: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3554b2c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 33192 1726883087.70498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 33192 1726883087.70501: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883087.70510: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d355bddf0> <<< 33192 1726883087.70558: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3554bdd0> <<< 33192 1726883087.70577: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360ee8a0> <<< 33192 1726883087.70839: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 33192 1726883087.70842: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 33192 1726883087.70845: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.70886: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # <<< 33192 1726883087.70893: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.70913: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 33192 1726883087.70920: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.70963: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.71156: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 33192 1726883087.71165: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.71209: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 33192 1726883087.71217: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.71286: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 33192 1726883087.71350: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.71414: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.71474: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 33192 1726883087.71484: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 33192 1726883087.71491: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.72111: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.72650: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 33192 1726883087.72658: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.72715: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.72854: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 33192 1726883087.72862: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.72893: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.72926: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 33192 1726883087.72937: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.72997: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.73153: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 33192 1726883087.73196: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.73276: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 33192 1726883087.73279: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 
1726883087.73306: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.73414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 33192 1726883087.73443: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d355bf770> <<< 33192 1726883087.73649: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d355be9c0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 33192 1726883087.73711: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.73761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 33192 1726883087.73768: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.73930: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.73973: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 33192 1726883087.73979: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.74059: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.74126: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 33192 1726883087.74136: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.74185: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.74239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 33192 1726883087.74285: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 33192 1726883087.74422: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d355ea150> <<< 33192 1726883087.74867: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d355d9f40> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 33192 1726883087.74870: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.75024: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.75077: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.75231: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 33192 1726883087.75256: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 33192 1726883087.75261: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.75289: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.75332: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 33192 1726883087.75343: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.75749: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed 
from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d35401be0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d355da930> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 33192 1726883087.75908: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.75984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 33192 1726883087.75992: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.76108: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.76216: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.76278: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.76307: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 33192 1726883087.76311: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 33192 1726883087.76331: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.76362: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.76413: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.76582: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.76690: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 33192 1726883087.76694: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 33192 1726883087.76909: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.76965: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 33192 1726883087.76984: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.77011: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.77160: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.77719: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.78289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 33192 1726883087.78310: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.78417: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.78541: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 33192 1726883087.78558: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.78686: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.78775: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 33192 1726883087.78833: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.78989: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.79273: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 33192 1726883087.79280: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 33192 1726883087.79645: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 33192 1726883087.79664: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33192 1726883087.79853: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.79954: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 33192 1726883087.79970: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.80009: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.80058: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 33192 1726883087.80084: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.80125: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 33192 1726883087.80203: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.80283: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 33192 1726883087.80304: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.80328: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.80354: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 33192 1726883087.80414: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.80485: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 33192 1726883087.80562: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33192 1726883087.80622: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 33192 1726883087.80637: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.80923: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.81236: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 33192 1726883087.81240: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.81291: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.81362: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 33192 1726883087.81378: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.81404: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.81452: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 33192 1726883087.81491: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.81541: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 33192 1726883087.81554: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.81576: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.81619: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 33192 1726883087.81623: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.81732: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.81811: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 33192 1726883087.81862: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.82101: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 33192 1726883087.82104: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.82119: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 33192 1726883087.82181: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.82256: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 33192 1726883087.82288: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.dragonfly' # <<< 33192 1726883087.82308: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.82332: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.82392: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 33192 1726883087.82419: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.82646: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.82909: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available <<< 33192 1726883087.82969: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 33192 1726883087.83018: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.83080: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 33192 1726883087.83095: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.83170: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.83339: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 33192 1726883087.83343: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.83376: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.83476: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 33192 1726883087.83629: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883087.84156: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 33192 1726883087.84196: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 33192 1726883087.84325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3542f470> <<< 33192 1726883087.84329: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3542e150> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3542ef30> <<< 33192 1726883087.98696: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 33192 1726883087.98765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 33192 1726883087.98801: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d35476a80> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d35474fe0> <<< 33192 1726883087.98984: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 33192 1726883087.98988: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d35477380> <<< 33192 1726883087.99054: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d35475f70> <<< 33192 1726883087.99211: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 33192 1726883088.23961: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_hostnqn": "", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", 
"ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2777, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 940, "free": 2777}, "nocache": {"free": 3419, "used": 298}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1042, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251180498944, "block_size": 4096, "block_total": 64483404, "block_available": 61323364, "block_used": 3160040, "inode_total": 16384000, "inode_available": 16303424, "inode_used": 80576, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": 
["us-east-1.aws.redhat.com"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off 
[fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation<<< 33192 1726883088.24165: stdout chunk (state=3): >>>": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.5107421875, "5m": 0.5224609375, "15m": 0.34326171875}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "44", "second": "48", "epoch": "1726883088", "epoch_int": "1726883088", "date": "2024-09-20", "time": "21:44:48", "iso8601_micro": 
"2024-09-21T01:44:48.234511Z", "iso8601": "2024-09-21T01:44:48Z", "iso8601_basic": "20240920T214448234511", "iso8601_basic_short": "20240920T214448", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 33192 1726883088.24564: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 33192 1726883088.24603: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] 
removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack <<< 33192 1726883088.24739: stdout chunk (state=3): >>># destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # 
cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing 
systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] 
removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info <<< 33192 1726883088.24779: stdout chunk (state=3): >>># destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing 
ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # 
cleanup[2] removing ansible.module_utils.facts.hardware.aix <<< 33192 1726883088.24850: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing 
ansible.module_utils.facts.virtual.freebsd <<< 33192 1726883088.24945: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy 
ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 33192 1726883088.25217: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 33192 1726883088.25294: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 33192 1726883088.25313: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 33192 1726883088.25415: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 33192 1726883088.25523: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 33192 1726883088.25582: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle <<< 33192 1726883088.25643: stdout chunk (state=3): >>># destroy 
queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 33192 1726883088.25690: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 33192 1726883088.25977: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 33192 1726883088.25993: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect 
# cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 33192 1726883088.26128: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 33192 1726883088.26156: stdout chunk (state=3): >>># destroy _collections <<< 33192 1726883088.26204: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 33192 1726883088.26225: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # 
destroy contextlib <<< 33192 1726883088.26300: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 33192 1726883088.26313: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 33192 1726883088.26431: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8<<< 33192 1726883088.26437: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 33192 1726883088.26519: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools <<< 33192 1726883088.26532: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 33192 1726883088.27140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 33192 1726883088.27143: stdout chunk (state=3): >>><<< 33192 1726883088.27145: stderr chunk (state=3): >>><<< 33192 1726883088.27362: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d367b4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36783b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d367b6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36589160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36589fd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365c7dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365c7fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365ff800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365ffe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365dfaa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365dd190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365c4f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36623710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36622330> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365de060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36620a40> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d366546b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365c4200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36654b60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365c6840> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36654dd0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d365c2d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d366554c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36655190> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d366563c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d366705c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36671d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36672bd0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36673230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36672120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36673c80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d366733b0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d366563f0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36363b30> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3638c620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3638c380> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3638c590> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3638c7d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36361cd0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3638de50> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3638cb00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36656ae0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d363be1b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d363d6360> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36413140> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d364398e0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36413260> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d363d6ff0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36260230> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d363d53a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3638ed50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2d36260410> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_xnyxpsds/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d362c5f70> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3629ce60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36263fb0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3629fdd0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d362f58b0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d362f5640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d362f4f80> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d362f5a30> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d362c6990> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f2d362f65d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d362f6810> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d362f6d50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3615cb60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3615e780> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3615f110> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3615ff80> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36162db0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36162ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36161070> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36166bd0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361656a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36165400> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f2d36167b00> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36161580> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361aad20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361aaf30> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361ac9e0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361ac7a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361aeed0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361ad0d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361ba6f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361af080> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361bb440> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361bb7d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361bba10> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2d361ab0b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361beab0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361bfe60> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361bd250> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d361be120> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d361bcdd0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36048050> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36048e90> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36166ba0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36049040> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3604b2c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36051a90> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d360523c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3604a5d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d36051160> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360525d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360e6690> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3605c2c0> import 'distro.distro' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36056450> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360562a0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360ecd70> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object 
from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d35547dd0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d355483e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d36064e90> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360642c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360eec30> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360ee780> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3554b1d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3554aa80> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3554ac60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d35549eb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3554b2c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d355bddf0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3554bdd0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d360ee8a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d355bf770> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d355be9c0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d355ea150> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d355d9f40> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d35401be0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d355da930> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d3542f470> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3542e150> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d3542ef30> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d35476a80> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d35474fe0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d35477380> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d35475f70> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_hostnqn": "", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2777, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 940, "free": 2777}, "nocache": {"free": 3419, "used": 298}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": 
"NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_uuid": "ec21dae8-c3a8-315c-7fcf-f8a700ae1140", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1042, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251180498944, "block_size": 4096, "block_total": 64483404, "block_available": 61323364, "block_used": 3160040, "inode_total": 16384000, 
"inode_available": 16303424, "inode_used": 80576, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": 
[{"address": "fe80::bb10:9a17:6b35:7604", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off 
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8c:42:87:d8:29", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.217"], "ansible_all_ipv6_addresses": ["fe80::bb10:9a17:6b35:7604"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::bb10:9a17:6b35:7604"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.5107421875, "5m": 0.5224609375, "15m": 0.34326171875}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", 
"minute": "44", "second": "48", "epoch": "1726883088", "epoch_int": "1726883088", "date": "2024-09-20", "time": "21:44:48", "iso8601_micro": "2024-09-21T01:44:48.234511Z", "iso8601": "2024-09-21T01:44:48Z", "iso8601_basic": "20240920T214448234511", "iso8601_basic_short": "20240920T214448", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] 
removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib 
# destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] 
removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] 
removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # 
cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing 
ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy 
ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # 
cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy 
multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # 
destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # 
destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
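The warning that follows arises because the module's stdout carried interpreter-shutdown noise (the `# clear …` / `# destroy …` lines above) after its JSON result. A minimal sketch of how such output can be split into the JSON payload and the trailing junk, using only the standard library — this is a hypothetical helper for illustration, not Ansible's own parser:

```python
import json

def split_json_and_junk(stdout: str):
    """Split module stdout into its leading JSON object and any
    trailing noise (the 'junk after the JSON data')."""
    decoder = json.JSONDecoder()
    # raw_decode parses one JSON value starting at index 0 and
    # returns (object, index_where_parsing_stopped).
    obj, end = decoder.raw_decode(stdout)
    return obj, stdout[end:].strip()

# Example shaped like the log above (values abbreviated):
raw = ('{"changed": false, "ansible_facts": {"ansible_pkg_mgr": "dnf"}}'
       ' # clear sys.path_hooks # destroy _imp')
facts, junk = split_json_and_junk(raw)
```

Because the JSON object is well-formed, Ansible can still use the result; the junk is reported but discarded, which is why the play continues normally after the warning.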
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 [the warning repeats, verbatim, the interpreter-shutdown output already shown in full above]
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # 
cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # 
cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy 
encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # 
cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # 
destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
33192 1726883088.29114: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33192 1726883088.29117: _low_level_execute_command(): starting 33192 1726883088.29120: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883086.6664639-33203-249175268240532/ > /dev/null 2>&1 && sleep 0' 33192 1726883088.30567: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found <<< 33192 1726883088.30571: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883088.30573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883088.30576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883088.30799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883088.32626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883088.32633: stderr chunk (state=3): >>><<< 33192 1726883088.32639: stdout chunk (state=3): >>><<< 33192 1726883088.32663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33192 1726883088.32672: handler run complete 33192 1726883088.33180: variable 'ansible_facts' from source: unknown 33192 1726883088.33583: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883088.34898: variable 'ansible_facts' from source: unknown 33192 1726883088.35288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883088.35707: attempt loop complete, returning result 33192 1726883088.36154: _execute() done 33192 1726883088.36157: dumping result to json 33192 1726883088.36201: done dumping result, returning 33192 1726883088.36211: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affe814-3a2d-6c15-6a7e-000000000147] 33192 1726883088.36217: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000147 ok: [managed_node1] 33192 1726883088.37913: no more pending results, returning what we have 33192 1726883088.37917: results queue empty 33192 1726883088.37918: checking for any_errors_fatal 33192 1726883088.37920: done checking for any_errors_fatal 33192 1726883088.37921: checking for max_fail_percentage 33192 1726883088.37923: done checking for max_fail_percentage 33192 1726883088.37924: checking to see if all hosts have failed and the running result is not ok 33192 1726883088.37925: done checking to see if all hosts have failed 33192 1726883088.37926: getting the remaining hosts for this loop 33192 1726883088.37928: done getting the remaining hosts for this loop 33192 1726883088.37932: getting the next task for host managed_node1 33192 1726883088.37940: done getting next task for host managed_node1 33192 1726883088.37943: ^ task is: TASK: meta (flush_handlers) 33192 1726883088.38000: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883088.38006: getting variables 33192 1726883088.38008: in VariableManager get_vars() 33192 1726883088.38035: Calling all_inventory to load vars for managed_node1 33192 1726883088.38039: Calling groups_inventory to load vars for managed_node1 33192 1726883088.38043: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883088.38051: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000147 33192 1726883088.38054: WORKER PROCESS EXITING 33192 1726883088.38117: Calling all_plugins_play to load vars for managed_node1 33192 1726883088.38121: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883088.38126: Calling groups_plugins_play to load vars for managed_node1 33192 1726883088.38739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883088.39423: done with get_vars() 33192 1726883088.39437: done getting variables 33192 1726883088.39569: in VariableManager get_vars() 33192 1726883088.39637: Calling all_inventory to load vars for managed_node1 33192 1726883088.39640: Calling groups_inventory to load vars for managed_node1 33192 1726883088.39644: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883088.39649: Calling all_plugins_play to load vars for managed_node1 33192 1726883088.39652: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883088.39656: Calling groups_plugins_play to load vars for managed_node1 33192 1726883088.40115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883088.40975: done with get_vars() 33192 1726883088.40992: done queuing things up, now waiting for results queue to drain 33192 1726883088.40995: results queue empty 33192 1726883088.40996: checking for any_errors_fatal 33192 1726883088.40999: done checking for any_errors_fatal 33192 1726883088.41000: checking for max_fail_percentage 33192 
1726883088.41001: done checking for max_fail_percentage 33192 1726883088.41002: checking to see if all hosts have failed and the running result is not ok 33192 1726883088.41003: done checking to see if all hosts have failed 33192 1726883088.41008: getting the remaining hosts for this loop 33192 1726883088.41010: done getting the remaining hosts for this loop 33192 1726883088.41013: getting the next task for host managed_node1 33192 1726883088.41018: done getting next task for host managed_node1 33192 1726883088.41022: ^ task is: TASK: Include the task 'el_repo_setup.yml' 33192 1726883088.41024: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33192 1726883088.41026: getting variables 33192 1726883088.41028: in VariableManager get_vars() 33192 1726883088.41170: Calling all_inventory to load vars for managed_node1 33192 1726883088.41176: Calling groups_inventory to load vars for managed_node1 33192 1726883088.41179: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883088.41185: Calling all_plugins_play to load vars for managed_node1 33192 1726883088.41188: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883088.41191: Calling groups_plugins_play to load vars for managed_node1 33192 1726883088.41587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883088.42195: done with get_vars() 33192 1726883088.42205: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:11 Friday 20 September 2024 21:44:48 -0400 (0:00:01.807) 0:00:01.823 ****** 33192 
1726883088.42306: entering _queue_task() for managed_node1/include_tasks 33192 1726883088.42309: Creating lock for include_tasks 33192 1726883088.42677: worker is 1 (out of 1 available) 33192 1726883088.42690: exiting _queue_task() for managed_node1/include_tasks 33192 1726883088.42817: done queuing things up, now waiting for results queue to drain 33192 1726883088.42820: waiting for pending results... 33192 1726883088.42977: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 33192 1726883088.43096: in run() - task 0affe814-3a2d-6c15-6a7e-000000000006 33192 1726883088.43116: variable 'ansible_search_path' from source: unknown 33192 1726883088.43174: calling self._execute() 33192 1726883088.43274: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883088.43298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883088.43315: variable 'omit' from source: magic vars 33192 1726883088.43451: _execute() done 33192 1726883088.43466: dumping result to json 33192 1726883088.43480: done dumping result, returning 33192 1726883088.43492: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0affe814-3a2d-6c15-6a7e-000000000006] 33192 1726883088.43506: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000006 33192 1726883088.43751: no more pending results, returning what we have 33192 1726883088.43757: in VariableManager get_vars() 33192 1726883088.43846: Calling all_inventory to load vars for managed_node1 33192 1726883088.43850: Calling groups_inventory to load vars for managed_node1 33192 1726883088.43854: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883088.43870: Calling all_plugins_play to load vars for managed_node1 33192 1726883088.43877: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883088.43881: Calling groups_plugins_play to load vars for managed_node1 33192 1726883088.44286: 
done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000006 33192 1726883088.44289: WORKER PROCESS EXITING 33192 1726883088.44315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883088.44648: done with get_vars() 33192 1726883088.44661: variable 'ansible_search_path' from source: unknown 33192 1726883088.44678: we have included files to process 33192 1726883088.44679: generating all_blocks data 33192 1726883088.44680: done generating all_blocks data 33192 1726883088.44681: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 33192 1726883088.44683: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 33192 1726883088.44686: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 33192 1726883088.45604: in VariableManager get_vars() 33192 1726883088.45624: done with get_vars() 33192 1726883088.45647: done processing included file 33192 1726883088.45650: iterating over new_blocks loaded from include file 33192 1726883088.45652: in VariableManager get_vars() 33192 1726883088.45665: done with get_vars() 33192 1726883088.45667: filtering new block on tags 33192 1726883088.45695: done filtering new block on tags 33192 1726883088.45699: in VariableManager get_vars() 33192 1726883088.45713: done with get_vars() 33192 1726883088.45715: filtering new block on tags 33192 1726883088.45738: done filtering new block on tags 33192 1726883088.45767: in VariableManager get_vars() 33192 1726883088.45784: done with get_vars() 33192 1726883088.45786: filtering new block on tags 33192 1726883088.45809: done filtering new block on tags 33192 1726883088.45812: done iterating over new_blocks loaded from include file included: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 33192 1726883088.45818: extending task lists for all hosts with included blocks 33192 1726883088.45889: done extending task lists 33192 1726883088.45890: done processing included files 33192 1726883088.45891: results queue empty 33192 1726883088.45892: checking for any_errors_fatal 33192 1726883088.45894: done checking for any_errors_fatal 33192 1726883088.45894: checking for max_fail_percentage 33192 1726883088.45896: done checking for max_fail_percentage 33192 1726883088.45897: checking to see if all hosts have failed and the running result is not ok 33192 1726883088.45898: done checking to see if all hosts have failed 33192 1726883088.45899: getting the remaining hosts for this loop 33192 1726883088.45905: done getting the remaining hosts for this loop 33192 1726883088.45909: getting the next task for host managed_node1 33192 1726883088.45913: done getting next task for host managed_node1 33192 1726883088.45916: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 33192 1726883088.45919: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883088.45921: getting variables 33192 1726883088.45922: in VariableManager get_vars() 33192 1726883088.45931: Calling all_inventory to load vars for managed_node1 33192 1726883088.45936: Calling groups_inventory to load vars for managed_node1 33192 1726883088.45939: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883088.45945: Calling all_plugins_play to load vars for managed_node1 33192 1726883088.45948: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883088.45952: Calling groups_plugins_play to load vars for managed_node1 33192 1726883088.46198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883088.46542: done with get_vars() 33192 1726883088.46552: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:44:48 -0400 (0:00:00.043) 0:00:01.866 ****** 33192 1726883088.46638: entering _queue_task() for managed_node1/setup 33192 1726883088.46981: worker is 1 (out of 1 available) 33192 1726883088.46996: exiting _queue_task() for managed_node1/setup 33192 1726883088.47009: done queuing things up, now waiting for results queue to drain 33192 1726883088.47011: waiting for pending results... 
33192 1726883088.47475: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 33192 1726883088.47841: in run() - task 0affe814-3a2d-6c15-6a7e-000000000158 33192 1726883088.47846: variable 'ansible_search_path' from source: unknown 33192 1726883088.47849: variable 'ansible_search_path' from source: unknown 33192 1726883088.47969: calling self._execute() 33192 1726883088.48089: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883088.48298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883088.48301: variable 'omit' from source: magic vars 33192 1726883088.49666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33192 1726883088.53815: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33192 1726883088.53878: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33192 1726883088.53906: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33192 1726883088.53939: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33192 1726883088.53964: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33192 1726883088.54191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33192 1726883088.54214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33192 1726883088.54238: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33192 1726883088.54277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33192 1726883088.54287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33192 1726883088.54428: variable 'ansible_facts' from source: unknown 33192 1726883088.54493: variable 'network_test_required_facts' from source: task vars 33192 1726883088.54520: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 33192 1726883088.54526: variable 'omit' from source: magic vars 33192 1726883088.54562: variable 'omit' from source: magic vars 33192 1726883088.54590: variable 'omit' from source: magic vars 33192 1726883088.54615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33192 1726883088.54638: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33192 1726883088.54656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33192 1726883088.54675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33192 1726883088.54683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33192 1726883088.54710: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33192 1726883088.54714: variable 'ansible_host' from source: host vars for 
'managed_node1' 33192 1726883088.54735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883088.54822: Set connection var ansible_shell_type to sh 33192 1726883088.54826: Set connection var ansible_connection to ssh 33192 1726883088.54938: Set connection var ansible_timeout to 10 33192 1726883088.54942: Set connection var ansible_module_compression to ZIP_DEFLATED 33192 1726883088.54945: Set connection var ansible_pipelining to False 33192 1726883088.54947: Set connection var ansible_shell_executable to /bin/sh 33192 1726883088.54950: variable 'ansible_shell_executable' from source: unknown 33192 1726883088.54953: variable 'ansible_connection' from source: unknown 33192 1726883088.54955: variable 'ansible_module_compression' from source: unknown 33192 1726883088.54957: variable 'ansible_shell_type' from source: unknown 33192 1726883088.54959: variable 'ansible_shell_executable' from source: unknown 33192 1726883088.54962: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883088.54964: variable 'ansible_pipelining' from source: unknown 33192 1726883088.54966: variable 'ansible_timeout' from source: unknown 33192 1726883088.54968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883088.55083: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 33192 1726883088.55097: variable 'omit' from source: magic vars 33192 1726883088.55100: starting attempt loop 33192 1726883088.55103: running the handler 33192 1726883088.55127: _low_level_execute_command(): starting 33192 1726883088.55130: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33192 1726883088.56125: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 33192 
1726883088.56128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33192 1726883088.56131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883088.56133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33192 1726883088.56138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 33192 1726883088.56140: stderr chunk (state=3): >>>debug2: match not found <<< 33192 1726883088.56142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883088.56144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33192 1726883088.56147: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 33192 1726883088.56148: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33192 1726883088.56150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33192 1726883088.56152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883088.56201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883088.56205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883088.56287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 33192 1726883088.58707: stdout chunk (state=3): >>>/root <<< 33192 1726883088.59039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 
1726883088.59043: stdout chunk (state=3): >>><<< 33192 1726883088.59045: stderr chunk (state=3): >>><<< 33192 1726883088.59048: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 33192 1726883088.59050: _low_level_execute_command(): starting 33192 1726883088.59053: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583 `" && echo ansible-tmp-1726883088.5897825-33259-9361258256583="` echo /root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583 `" ) && sleep 0' 33192 1726883088.59697: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 33192 1726883088.59842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33192 1726883088.59845: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883088.59848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33192 1726883088.59850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 33192 1726883088.59852: stderr chunk (state=3): >>>debug2: match not found <<< 33192 1726883088.59854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883088.59857: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33192 1726883088.59859: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address <<< 33192 1726883088.59862: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883088.59889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33192 1726883088.59920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883088.59932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883088.60138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883088.61998: stdout chunk (state=3): >>>ansible-tmp-1726883088.5897825-33259-9361258256583=/root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583 <<< 33192 1726883088.62138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883088.62198: stderr chunk (state=3): >>><<< 33192 1726883088.62229: stdout chunk 
(state=3): >>><<< 33192 1726883088.62273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883088.5897825-33259-9361258256583=/root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33192 1726883088.62328: variable 'ansible_module_compression' from source: unknown 33192 1726883088.62385: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33192zxvjc6ee/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 33192 1726883088.62473: variable 'ansible_facts' from source: unknown 33192 1726883088.62857: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583/AnsiballZ_setup.py 33192 1726883088.63301: Sending initial data 33192 1726883088.63305: Sent initial data (152 bytes) 33192 1726883088.64599: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
<<< 33192 1726883088.64603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883088.64607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33192 1726883088.64610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883088.64642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883088.64771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883088.66367: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension 
"copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 33192 1726883088.66445: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 33192 1726883088.66524: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33192zxvjc6ee/tmphvu6gs_f /root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583/AnsiballZ_setup.py <<< 33192 1726883088.66530: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583/AnsiballZ_setup.py" <<< 33192 1726883088.66614: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-33192zxvjc6ee/tmphvu6gs_f" to remote "/root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583/AnsiballZ_setup.py" <<< 33192 1726883088.69570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883088.69611: stderr chunk (state=3): >>><<< 33192 1726883088.69615: stdout chunk (state=3): >>><<< 33192 1726883088.69678: done transferring module to remote 33192 1726883088.69683: _low_level_execute_command(): starting 33192 1726883088.69686: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583/ /root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583/AnsiballZ_setup.py && sleep 0' 33192 1726883088.70358: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 33192 1726883088.70449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883088.70539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33192 1726883088.70542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883088.70545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883088.70604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883088.72562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883088.72566: stdout chunk (state=3): >>><<< 33192 1726883088.72568: stderr chunk (state=3): >>><<< 33192 1726883088.72693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33192 1726883088.72697: _low_level_execute_command(): starting 33192 1726883088.72700: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583/AnsiballZ_setup.py && sleep 0' 33192 1726883088.73351: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883088.73418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33192 1726883088.73438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883088.73482: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 33192 1726883088.73569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883088.75960: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 33192 1726883088.76048: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # <<< 33192 1726883088.76169: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # import '_io' # <<< 33192 1726883088.76173: stdout chunk (state=3): >>>import 'marshal' # <<< 33192 1726883088.76221: stdout chunk (state=3): >>>import 'posix' # <<< 33192 1726883088.76276: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 33192 1726883088.76280: stdout chunk (state=3): >>> # installing zipimport hook <<< 33192 1726883088.76290: stdout chunk (state=3): >>>import 'time' # <<< 33192 1726883088.76306: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 33192 1726883088.76393: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 33192 1726883088.76397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883088.76405: stdout chunk (state=3): >>>import '_codecs' # <<< 33192 1726883088.76468: stdout chunk (state=3): >>>import 'codecs' # <<< 33192 1726883088.76486: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 33192 1726883088.76542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cf2c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cefbb30> <<< 33192 1726883088.76573: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cf2eab0> <<< 33192 1726883088.76631: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 33192 1726883088.76659: stdout chunk (state=3): >>>import 'io' # <<< 33192 1726883088.76712: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 33192 1726883088.76877: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 33192 1726883088.76912: stdout chunk (state=3): >>>import 'os' # <<< 33192 1726883088.76961: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 33192 1726883088.76976: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 33192 1726883088.77017: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 33192 1726883088.77052: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ccfd160> <<< 33192 1726883088.77143: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883088.77193: stdout chunk (state=3): >>>import '_distutils_hack' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f974ccfdfd0> import 'site' # <<< 33192 1726883088.77220: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 33192 1726883088.77982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc'<<< 33192 1726883088.78017: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py<<< 33192 1726883088.78025: stdout chunk (state=3): >>> <<< 33192 1726883088.78047: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 33192 1726883088.78086: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 33192 1726883088.78161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 33192 1726883088.78196: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py<<< 33192 1726883088.78204: stdout chunk (state=3): >>> <<< 33192 1726883088.78268: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd3be90><<< 33192 1726883088.78304: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py<<< 33192 1726883088.78310: stdout chunk (state=3): >>> <<< 33192 1726883088.78342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 33192 1726883088.78411: stdout chunk 
(state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd3bf50> <<< 33192 1726883088.78417: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 33192 1726883088.78510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 33192 1726883088.78601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883088.78604: stdout chunk (state=3): >>>import 'itertools' # <<< 33192 1726883088.78632: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd738c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 33192 1726883088.78680: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd73f50> import '_collections' # <<< 33192 1726883088.78731: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd53b60> <<< 33192 1726883088.78753: stdout chunk (state=3): >>>import '_functools' # <<< 33192 1726883088.78769: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd51280> <<< 33192 1726883088.78868: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd39040> <<< 33192 1726883088.78913: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 33192 1726883088.78954: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 33192 1726883088.78968: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 33192 1726883088.79016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 33192 1726883088.79024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 33192 1726883088.79050: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd97800> <<< 33192 1726883088.79086: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd96420> <<< 33192 1726883088.79100: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd52150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd94cb0> <<< 33192 1726883088.79164: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 33192 1726883088.79191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdc8860> import 're' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f974cd382c0> <<< 33192 1726883088.79209: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 33192 1726883088.79247: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cdc8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdc8bc0> <<< 33192 1726883088.79292: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cdc8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd36de0> <<< 33192 1726883088.79330: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883088.79389: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 33192 1726883088.79406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdc9640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdc9310> 
<<< 33192 1726883088.79433: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 33192 1726883088.79491: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 33192 1726883088.79495: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdca4e0> <<< 33192 1726883088.79522: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 33192 1726883088.79525: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 33192 1726883088.79551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 33192 1726883088.79590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 33192 1726883088.79613: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cde0710> <<< 33192 1726883088.79616: stdout chunk (state=3): >>>import 'errno' # <<< 33192 1726883088.79850: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883088.79887: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cde1df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches 
/usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cde2cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cde3320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cde2240> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cde3da0> <<< 33192 1726883088.79893: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cde34d0> <<< 33192 1726883088.79935: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdca540> <<< 33192 1726883088.80043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 33192 1726883088.80075: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 33192 
1726883088.80085: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cadfc80> <<< 33192 1726883088.80099: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 33192 1726883088.80105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 33192 1726883088.80150: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883088.80163: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cb087a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb08500> <<< 33192 1726883088.80176: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883088.80182: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cb08710> <<< 33192 1726883088.80214: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883088.80221: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cb08980> <<< 33192 1726883088.80264: stdout chunk (state=3): >>>import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cadde20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 33192 1726883088.80457: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 33192 1726883088.80479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 33192 1726883088.80489: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb0a090> <<< 33192 1726883088.80540: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb08d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdcac30> <<< 33192 1726883088.80573: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 33192 1726883088.80648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883088.80738: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 33192 1726883088.80776: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb36420> <<< 33192 1726883088.80847: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 33192 1726883088.80862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 33192 
1726883088.80892: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 33192 1726883088.80920: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 33192 1726883088.80992: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb52510> <<< 33192 1726883088.81020: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 33192 1726883088.81077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 33192 1726883088.81169: stdout chunk (state=3): >>>import 'ntpath' # <<< 33192 1726883088.81220: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb8b290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 33192 1726883088.81279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 33192 1726883088.81305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 33192 1726883088.81380: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 33192 1726883088.81522: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cbb5a30> <<< 33192 1726883088.81631: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb8b3b0> <<< 
33192 1726883088.81692: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb531a0> <<< 33192 1726883088.81731: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c9d8380> <<< 33192 1726883088.81837: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb51550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb0afc0> <<< 33192 1726883088.82016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 33192 1726883088.82051: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f974c9d8560> <<< 33192 1726883088.82364: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_s_7iimly/ansible_setup_payload.zip' # zipimport: zlib available <<< 33192 1726883088.82611: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.82666: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 33192 1726883088.82841: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 33192 1726883088.82847: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 33192 1726883088.82888: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object 
from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca41f10> <<< 33192 1726883088.82894: stdout chunk (state=3): >>>import '_typing' # <<< 33192 1726883088.83210: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca18e00> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c9dbf20> <<< 33192 1726883088.83214: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.83260: stdout chunk (state=3): >>>import 'ansible' # <<< 33192 1726883088.83273: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.83284: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.83315: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.83321: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 33192 1726883088.83348: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.85877: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.87483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 33192 1726883088.87489: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 33192 1726883088.87765: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca1bd70> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # 
/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974ca718b0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca71640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca70f50> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 33192 1726883088.87963: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca716a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca42930> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974ca72600> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974ca72840> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 33192 
1726883088.88055: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca72d80> import 'pwd' # <<< 33192 1726883088.88058: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 33192 1726883088.88094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 33192 1726883088.88142: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8d8b00> <<< 33192 1726883088.88185: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c8da720> <<< 33192 1726883088.88229: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 33192 1726883088.88233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 33192 1726883088.88294: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8db050> <<< 33192 1726883088.88323: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 33192 1726883088.88356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 33192 1726883088.88373: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8dbf50> <<< 33192 1726883088.88392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py 
<<< 33192 1726883088.88465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 33192 1726883088.88473: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 33192 1726883088.88627: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8dec00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c8def30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8dcdd0> <<< 33192 1726883088.88663: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 33192 1726883088.88713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 33192 1726883088.88715: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 33192 1726883088.88741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 33192 1726883088.88807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 33192 
1726883088.88827: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8e2c60> <<< 33192 1726883088.88905: stdout chunk (state=3): >>>import '_tokenize' # <<< 33192 1726883088.88947: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8e1760> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8e14c0> <<< 33192 1726883088.88976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 33192 1726883088.88986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 33192 1726883088.89116: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8e3f50> <<< 33192 1726883088.89151: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8dd3d0> <<< 33192 1726883088.89183: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c926d80> <<< 33192 1726883088.89237: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c926f00> <<< 33192 1726883088.89255: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/journal.py <<< 33192 1726883088.89276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 33192 1726883088.89340: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 33192 1726883088.89353: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883088.89453: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c928ad0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c928890> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 33192 1726883088.89638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883088.89663: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c92b080><<< 33192 1726883088.89683: stdout chunk (state=3): >>> <<< 33192 1726883088.89689: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c9291c0><<< 33192 1726883088.89736: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 33192 1726883088.89819: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883088.89860: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 33192 1726883088.89886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 33192 1726883088.89912: stdout chunk (state=3): >>> import '_string' # <<< 33192 1726883088.89918: stdout chunk (state=3): >>> <<< 33192 1726883088.90032: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c936870> <<< 33192 1726883088.90292: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c92b200> <<< 33192 1726883088.90403: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883088.90446: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c937b90> <<< 33192 1726883088.90469: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883088.90491: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c937590><<< 33192 1726883088.90558: stdout chunk (state=3): >>> <<< 33192 1726883088.90599: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c937cb0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c927200> <<< 33192 1726883088.90664: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 33192 1726883088.90719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 33192 1726883088.90722: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883088.90777: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c93b320> <<< 33192 1726883088.91176: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c93c4a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c939a90> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 
'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c93ae10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c939670> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 33192 1726883088.91424: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.91910: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available <<< 33192 1726883088.92077: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.92456: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.93240: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883088.93329: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c7c4650> <<< 33192 1726883088.93406: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # 
code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 33192 1726883088.93426: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7c5400> <<< 33192 1726883088.93544: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c93f9b0> <<< 33192 1726883088.93549: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 33192 1726883088.93555: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.93558: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.93561: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 33192 1726883088.93563: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.94028: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 33192 1726883088.94047: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 33192 1726883088.94066: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7c54f0> # zipimport: zlib available <<< 33192 1726883088.94992: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.95994: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.96122: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.96281: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 33192 1726883088.96285: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.96338: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.96405: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 33192 1726883088.96448: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 33192 1726883088.96687: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.96691: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 33192 1726883088.96693: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.96883: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.96887: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 33192 1726883088.96889: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.97075: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.97358: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 33192 1726883088.97434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 33192 1726883088.97484: stdout chunk (state=3): >>>import '_ast' # <<< 33192 1726883088.97643: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7c7bc0> # zipimport: zlib available <<< 33192 1726883088.97681: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.97720: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 33192 1726883088.97747: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 33192 1726883088.97775: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 33192 1726883088.97797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 33192 1726883088.97853: stdout chunk (state=3): >>># extension 
module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883088.98353: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c7ce0f0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c7cea50> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7c6c30> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 33192 1726883088.98432: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.98539: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 33192 1726883088.98602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883088.98730: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c7cd760> <<< 33192 1726883088.98792: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f974c7cecc0> <<< 33192 1726883088.98853: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 33192 1726883088.98856: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.98954: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.99053: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.99094: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.99154: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883088.99181: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 33192 1726883088.99228: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 33192 1726883088.99247: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 33192 1726883088.99326: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 33192 1726883088.99369: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 33192 1726883088.99372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 33192 1726883088.99467: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c85ee10> <<< 33192 1726883088.99556: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f974c7dbc80> <<< 33192 1726883088.99677: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7d2c60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7d2a50> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 33192 1726883088.99681: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.99709: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.99755: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 33192 1726883088.99867: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 33192 1726883088.99928: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883088.99980: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.00100: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.00109: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.00130: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.00196: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.00277: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.00316: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.00374: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 33192 1726883089.00508: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.00816: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 33192 1726883089.01038: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.01410: 
stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33192 1726883089.01484: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883089.01513: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 33192 1726883089.01586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 33192 1726883089.01609: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 33192 1726883089.01700: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8659a0> <<< 33192 1726883089.01714: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 33192 1726883089.01817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bd483e0> <<< 33192 1726883089.01861: 
stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bd489b0> <<< 33192 1726883089.01962: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c84d460> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c84c770> <<< 33192 1726883089.02243: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8640b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8679e0> <<< 33192 1726883089.02247: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 33192 1726883089.02281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bd4b6e0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bd4af90> # extension module '_queue' loaded from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bd4b170> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bd4a3c0> <<< 33192 1726883089.02292: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 33192 1726883089.02453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bd4b800> <<< 33192 1726883089.02483: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 33192 1726883089.02851: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bdb2300> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bdb0320> <<< 33192 1726883089.02974: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c867bf0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 33192 1726883089.02994: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.03066: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 33192 1726883089.03122: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 33192 1726883089.03205: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 33192 1726883089.03305: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.03353: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 33192 1726883089.03437: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.03488: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 33192 1726883089.03579: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.03759: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33192 1726883089.03857: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 33192 1726883089.03991: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.04774: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.05561: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 33192 1726883089.05577: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.05658: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.05746: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.05790: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.05842: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 33192 1726883089.05944: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 33192 1726883089.06112: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # <<< 33192 1726883089.06138: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.06182: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.06226: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 33192 1726883089.06229: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.06269: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.06319: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 33192 1726883089.06448: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.06598: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 33192 1726883089.06601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 33192 1726883089.06635: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bdb2660> <<< 33192 1726883089.06691: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 33192 1726883089.06913: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f974bdb3230> import 'ansible.module_utils.facts.system.local' # <<< 33192 1726883089.06927: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.07039: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.07155: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 33192 1726883089.07305: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.07476: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 33192 1726883089.07580: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.07715: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 33192 1726883089.07839: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 33192 1726883089.07896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 33192 1726883089.07991: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883089.08094: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bdea6c0> <<< 33192 1726883089.08439: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bdd63c0> import 'ansible.module_utils.facts.system.python' # <<< 33192 1726883089.08457: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.08542: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.08625: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' 
# <<< 33192 1726883089.08647: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.08843: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.08917: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.09108: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.09373: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 33192 1726883089.09386: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.09435: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.09503: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 33192 1726883089.09519: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.09560: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.09641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 33192 1726883089.09708: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bc01ca0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bc01880> import 'ansible.module_utils.facts.system.user' # <<< 33192 1726883089.09749: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 33192 1726883089.09822: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33192 1726883089.09889: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 33192 1726883089.09892: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.10164: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.10423: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 33192 1726883089.10441: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.10601: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.10828: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33192 1726883089.10894: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 33192 1726883089.10913: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 33192 1726883089.11139: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33192 1726883089.11202: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.11463: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 33192 1726883089.11479: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 33192 1726883089.11738: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.11905: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 33192 1726883089.11924: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.11957: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.12004: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.13015: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.13943: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 33192 1726883089.13967: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 33192 1726883089.14149: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33192 1726883089.14317: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 33192 1726883089.14345: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.14495: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.14672: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 33192 1726883089.14697: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.14946: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.15212: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 33192 1726883089.15256: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.15259: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.15262: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 33192 1726883089.15279: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.15330: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.15382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 33192 1726883089.15399: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.15563: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.15736: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.16111: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.16478: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 33192 1726883089.16502: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.16556: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 33192 1726883089.16624: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 33192 1726883089.16627: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.16656: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.16702: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 33192 1726883089.16819: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.16943: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 33192 1726883089.17002: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 33192 1726883089.17026: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.17100: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.17194: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 33192 1726883089.17240: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.17296: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.17390: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 33192 1726883089.17402: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.18041: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.18360: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 33192 1726883089.18453: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.18546: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 33192 1726883089.18559: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.18602: stdout chunk (state=3): >>># zipimport: zlib available <<< 
33192 1726883089.18655: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 33192 1726883089.18667: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.18711: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.18768: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 33192 1726883089.18774: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.18811: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.18870: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 33192 1726883089.18877: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.19002: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.19175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 33192 1726883089.19179: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.19190: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 33192 1726883089.19249: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.19312: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 33192 1726883089.19341: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.19367: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.19393: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.19454: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.19645: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33192 1726883089.19776: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 
'ansible.module_utils.facts.virtual.dragonfly' # <<< 33192 1726883089.19799: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.19866: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.19950: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 33192 1726883089.19962: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.20307: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.20667: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 33192 1726883089.20675: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.20736: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.20827: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 33192 1726883089.20830: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.20891: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.20958: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 33192 1726883089.21046: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.21106: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.21252: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 33192 1726883089.21280: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.21406: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.21555: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 33192 1726883089.21677: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.22034: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 33192 1726883089.22056: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bc2e660> <<< 33192 1726883089.22140: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bc2ff50> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bc2be00> <<< 33192 1726883089.24256: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "44", "second": "49", "epoch": "1726883089", "epoch_int": "1726883089", "date": "2024-09-20", "time": "21:44:49", "iso8601_micro": 
"2024-09-21T01:44:49.239233Z", "iso8601": "2024-09-21T01:44:49Z", "iso8601_basic": "20240920T214449239233", "iso8601_basic_short": "20240920T214449", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 33192 1726883089.25017: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 33192 1726883089.25056: stdout chunk (state=3): >>> # clear sys.path_hooks<<< 33192 1726883089.25060: stdout chunk (state=3): >>> # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1<<< 33192 1726883089.25073: stdout chunk (state=3): >>> # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value<<< 33192 1726883089.25100: stdout chunk (state=3): >>> # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout<<< 33192 1726883089.25283: stdout chunk (state=3): >>> # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing 
_frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # 
cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] rem<<< 33192 1726883089.25307: stdout chunk (state=3): >>>oving signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing 
subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # 
destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing 
multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips <<< 33192 1726883089.25347: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local<<< 33192 1726883089.25381: stdout chunk (state=3): >>> # cleanup[2] 
removing ansible.module_utils.facts.system.lsb <<< 33192 1726883089.25393: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform<<< 33192 1726883089.25430: stdout chunk (state=3): >>> # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python <<< 33192 1726883089.25669: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing 
ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy 
ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy an<<< 33192 1726883089.25682: stdout chunk (state=3): >>>sible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # 
destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 33192 1726883089.26004: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 33192 1726883089.26036: stdout chunk (state=3): >>># destroy importlib.machinery <<< 33192 1726883089.26075: stdout chunk (state=3): >>># destroy importlib._abc # destroy importlib.util<<< 33192 1726883089.26078: stdout chunk (state=3): >>> <<< 33192 1726883089.26150: stdout chunk (state=3): >>># destroy _bz2 <<< 33192 1726883089.26153: stdout chunk (state=3): >>># destroy _compression<<< 33192 1726883089.26157: stdout chunk (state=3): >>> <<< 33192 1726883089.26160: stdout chunk (state=3): >>># destroy _lzma<<< 33192 1726883089.26162: stdout chunk (state=3): >>> <<< 33192 1726883089.26214: stdout chunk (state=3): >>># destroy binascii # destroy zlib <<< 33192 1726883089.26218: stdout chunk (state=3): >>># destroy bz2 <<< 33192 1726883089.26221: stdout chunk (state=3): >>># destroy lzma 
<<< 33192 1726883089.26248: stdout chunk (state=3): >>># destroy zipfile._path <<< 33192 1726883089.26312: stdout chunk (state=3): >>># destroy zipfile<<< 33192 1726883089.26316: stdout chunk (state=3): >>> # destroy pathlib # destroy zipfile._path.glob<<< 33192 1726883089.26318: stdout chunk (state=3): >>> # destroy ipaddress<<< 33192 1726883089.26402: stdout chunk (state=3): >>> # destroy ntpath <<< 33192 1726883089.26405: stdout chunk (state=3): >>># destroy importlib <<< 33192 1726883089.26463: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib<<< 33192 1726883089.26467: stdout chunk (state=3): >>> # destroy json.decoder # destroy json.encoder <<< 33192 1726883089.26509: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy grp<<< 33192 1726883089.26538: stdout chunk (state=3): >>> # destroy encodings # destroy _locale<<< 33192 1726883089.26541: stdout chunk (state=3): >>> # destroy locale<<< 33192 1726883089.26585: stdout chunk (state=3): >>> # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog<<< 33192 1726883089.26588: stdout chunk (state=3): >>> # destroy uuid<<< 33192 1726883089.26686: stdout chunk (state=3): >>> # destroy _hashlib <<< 33192 1726883089.26689: stdout chunk (state=3): >>># destroy _blake2 # destroy selinux <<< 33192 1726883089.26748: stdout chunk (state=3): >>># destroy shutil<<< 33192 1726883089.26751: stdout chunk (state=3): >>> # destroy distro # destroy distro.distro<<< 33192 1726883089.26763: stdout chunk (state=3): >>> # destroy argparse # destroy logging <<< 33192 1726883089.26817: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing<<< 33192 1726883089.26844: stdout chunk (state=3): >>> # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # 
destroy multiprocessing.context # destroy array<<< 33192 1726883089.26868: stdout chunk (state=3): >>> # destroy _compat_pickle # destroy _pickle<<< 33192 1726883089.26912: stdout chunk (state=3): >>> # destroy queue # destroy _heapq<<< 33192 1726883089.26949: stdout chunk (state=3): >>> # destroy _queue # destroy multiprocessing.process <<< 33192 1726883089.26952: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction<<< 33192 1726883089.27004: stdout chunk (state=3): >>> # destroy selectors # destroy _multiprocessing # destroy shlex <<< 33192 1726883089.27007: stdout chunk (state=3): >>># destroy fcntl<<< 33192 1726883089.27057: stdout chunk (state=3): >>> # destroy datetime<<< 33192 1726883089.27076: stdout chunk (state=3): >>> # destroy subprocess # destroy base64<<< 33192 1726883089.27088: stdout chunk (state=3): >>> # destroy _ssl <<< 33192 1726883089.27143: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 33192 1726883089.27165: stdout chunk (state=3): >>> # destroy getpass # destroy pwd<<< 33192 1726883089.27183: stdout chunk (state=3): >>> # destroy termios # destroy errno <<< 33192 1726883089.27223: stdout chunk (state=3): >>># destroy json # destroy socket <<< 33192 1726883089.27255: stdout chunk (state=3): >>># destroy struct<<< 33192 1726883089.27269: stdout chunk (state=3): >>> # destroy glob <<< 33192 1726883089.27373: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna <<< 33192 1726883089.27376: stdout chunk (state=3): >>># destroy stringprep<<< 33192 1726883089.27410: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves<<< 33192 1726883089.27444: stdout chunk (state=3): >>> # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader<<< 33192 1726883089.27465: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 33192 1726883089.27500: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 33192 1726883089.27533: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random<<< 33192 1726883089.27560: stdout chunk (state=3): >>> # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 33192 1726883089.27588: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg<<< 33192 1726883089.27621: stdout chunk (state=3): >>> # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 33192 1726883089.27662: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc<<< 33192 1726883089.27689: stdout chunk (state=3): >>> # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator<<< 33192 1726883089.27693: stdout chunk 
(state=3): >>> # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath<<< 33192 1726883089.27754: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time<<< 33192 1726883089.27758: stdout chunk (state=3): >>> # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 33192 1726883089.27796: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib<<< 33192 1726883089.27799: stdout chunk (state=3): >>> # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 33192 1726883089.27808: stdout chunk (state=3): >>> <<< 33192 1726883089.27842: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 33192 1726883089.28047: stdout chunk (state=3): >>># destroy sys.monitoring <<< 33192 1726883089.28100: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 33192 1726883089.28154: stdout chunk (state=3): >>># destroy platform<<< 33192 1726883089.28157: stdout chunk (state=3): >>> # destroy _uuid # destroy stat<<< 33192 1726883089.28174: stdout chunk (state=3): >>> # destroy genericpath # destroy re._parser # destroy tokenize<<< 33192 1726883089.28232: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib <<< 33192 1726883089.28248: stdout chunk (state=3): >>># destroy copyreg <<< 33192 1726883089.28297: stdout chunk (state=3): >>># destroy contextlib # destroy _typing<<< 33192 1726883089.28331: 
stdout chunk (state=3): >>> # destroy _tokenize <<< 33192 1726883089.28372: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error <<< 33192 1726883089.28377: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external<<< 33192 1726883089.28401: stdout chunk (state=3): >>> # destroy _imp # destroy _io<<< 33192 1726883089.28451: stdout chunk (state=3): >>> # destroy marshal # clear sys.meta_path<<< 33192 1726883089.28475: stdout chunk (state=3): >>> # clear sys.modules # destroy _frozen_importlib <<< 33192 1726883089.28593: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs<<< 33192 1726883089.28623: stdout chunk (state=3): >>> # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit<<< 33192 1726883089.28656: stdout chunk (state=3): >>> # destroy _warnings # destroy math # destroy _bisect # destroy time<<< 33192 1726883089.28666: stdout chunk (state=3): >>> <<< 33192 1726883089.28762: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _operator<<< 33192 1726883089.28766: stdout chunk (state=3): >>> # destroy _sha2<<< 33192 1726883089.28785: stdout chunk (state=3): >>> # destroy _sre # destroy _string <<< 33192 1726883089.28823: stdout chunk (state=3): >>># destroy re # destroy itertools # destroy _abc <<< 33192 1726883089.28856: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins<<< 33192 1726883089.28877: stdout chunk (state=3): >>> # destroy _thread # clear 
sys.audit hooks <<< 33192 1726883089.29596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. <<< 33192 1726883089.29599: stdout chunk (state=3): >>><<< 33192 1726883089.29602: stderr chunk (state=3): >>><<< 33192 1726883089.29953: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cf2c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cefbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cf2eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ccfd160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ccfdfd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd3be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f974cd3bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd738c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd73f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd53b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd51280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd39040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f974cd97800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd96420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd52150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd94cb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdc8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd382c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cdc8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdc8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cdc8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cd36de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object 
from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdc9640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdc9310> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdca4e0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cde0710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cde1df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f974cde2cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cde3320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cde2240> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cde3da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cde34d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdca540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cadfc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cb087a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb08500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cb08710> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974cb08980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cadde20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb0a090> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb08d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cdcac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb36420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb52510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb8b290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cbb5a30> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb8b3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb531a0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c9d8380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb51550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974cb0afc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f974c9d8560> # zipimport: found 103 names in '/tmp/ansible_setup_payload_s_7iimly/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca41f10> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca18e00> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c9dbf20> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca1bd70> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974ca718b0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca71640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca70f50> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca716a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca42930> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974ca72600> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974ca72840> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974ca72d80> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8d8b00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c8da720> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8db050> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8dbf50> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8dec00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c8def30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8dcdd0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8e2c60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8e1760> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8e14c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8e3f50> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f974c8dd3d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c926d80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c926f00> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c928ad0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c928890> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c92b080> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c9291c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c936870> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c92b200> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c937b90> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c937590> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c937cb0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c927200> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c93b320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c93c4a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c939a90> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c93ae10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c939670> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c7c4650> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7c5400> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c93f9b0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7c54f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7c7bc0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c7ce0f0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c7cea50> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7c6c30> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974c7cd760> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7cecc0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c85ee10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7dbc80> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f974c7d2c60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c7d2a50> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8659a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bd483e0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bd489b0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c84d460> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c84c770> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8640b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c8679e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bd4b6e0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bd4af90> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bd4b170> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bd4a3c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bd4b800> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bdb2300> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bdb0320> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974c867bf0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bdb2660> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bdb3230> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bdea6c0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bdd63c0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bc01ca0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bc01880> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f974bc2e660> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bc2ff50> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f974bc2be00> {"ansible_facts": 
{"ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-217", "ansible_nodename": "ip-10-31-10-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec21dae8c3a8315c7fcff8a700ae1140", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": 
"UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKNHHarzNQiKV9Fb8htkAo6V5gtUJbuBq7ufermmas6AagMSKqKyQaus7RRYNV0OV6WSVxouvjH4/8553bXF92vINMV37T3BVbSk0VjsDFFAEVkcy7KACT6upREthXzZwLKGK3O4ngGuc4tFf4pQ8aO6/f+Ohm4MzbhCTBhcqJAZAAAAFQClgsX0FPGUtboi3JLlgdUwEKs1QQAAAIBz7qRuyGTAbapZ14FtFLBd/Q0laoIT0Ng+sC/YShWSMBiBZRVJO3mNJQE7grw+G5/0xmxACjGd0+QZ+oyJeoMvQVHzKLhKNCQ5Qcli7GA0RhjCmFSxK8n8AMpfgdqAotUZ6ZM/CW7/H+Ep7tsT8jiMRjKnmn/+91PXtHzBqHvy7wAAAIBqn+Xsrfpj9UiHj75eG8gHsDD4pEVf0sY8iz5WBKk84gO63y8sEtJFcMk4z6d3sc8D+exGAETg/9GTzdTgIPSN1PiLTqVHEtlbgJ+im7iDKmVp6WGUg5p9gh8W0mmFQTtlZueefyvqpe89LjzuKwEioUAMWuj6jCnHVijuYPibng==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC1YAi1e55agg+XKOb96N2Hd6TUtxZ/7W67FkAKMTDd/JPwM9in1rbr68jzlzK4a0rCzng6JYcOJS1960MXsFkr9cKEEyRxrP+OcVVTCP1UBwwu+HeEtgzUGrkUqSozi+NM0AKc3uCoDmTWtndfQoQGBLd32f/hrMJsePHruozn79OIAbnq/odkEwUI1qi2n9hnLb1N5Fl3ftN+fbsO4xuY/yEGFk0z1aAAj7Vgd0BwnGBWIZ/SrGoijI6+YqSTBBu+/3QS+ArkKBr/GfRmxG4m4+VmBbzxjQ3VbpBtdydfkNIwD15OZRKS1cFilWjohPehP3UBvNNKlexDxvBeGPcdKQwz8VQOcbVxNj8TqQNkgfiOUDTqaKwGkLu5EbF+p40d+EpjceP/u40Mh56rEJaAMPWMkPROlGAqQt3naOhKJPg98dWS+w9gK+iW69TgJZtSqqlIoWdmJZQ0W/2R6Buf9ktgOHWYg+t5LZGP2Q6myRQWS/HxB6+hJ2WEB6pDObc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVCNaVFEWRPD6ZObUI3I47yORZdevoJeU4h657k6xFMv2EPlOCZq979bRxLfvVP++7xup0OeCRAJPwzE4wIsEg=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICX8RCP0XC2dyBTfIbAYFLUCYwTL55FaNzd8acASiOLe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "44", "second": "49", "epoch": "1726883089", "epoch_int": "1726883089", "date": "2024-09-20", "time": "21:44:49", "iso8601_micro": "2024-09-21T01:44:49.239233Z", "iso8601": "2024-09-21T01:44:49Z", "iso8601_basic": "20240920T214449239233", "iso8601_basic_short": "20240920T214449", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 55312 10.31.10.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 55312 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # 
clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # 
cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] 
removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing 
ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # 
cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # 
cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy 
ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy 
multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv
ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 33192 1726883089.31679: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 
33192 1726883089.31793: _low_level_execute_command(): starting 33192 1726883089.31797: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883088.5897825-33259-9361258256583/ > /dev/null 2>&1 && sleep 0' 33192 1726883089.31813: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883089.31833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33192 1726883089.31854: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 33192 1726883089.31865: stderr chunk (state=3): >>>debug2: match not found <<< 33192 1726883089.31880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883089.31909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33192 1726883089.32023: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 33192 1726883089.32039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883089.32055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883089.32157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883089.34418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 
1726883089.34500: stderr chunk (state=3): >>><<< 33192 1726883089.34503: stdout chunk (state=3): >>><<< 33192 1726883089.34528: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33192 1726883089.34537: handler run complete 33192 1726883089.34608: variable 'ansible_facts' from source: unknown 33192 1726883089.34843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883089.34894: variable 'ansible_facts' from source: unknown 33192 1726883089.34972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883089.35073: attempt loop complete, returning result 33192 1726883089.35080: _execute() done 33192 1726883089.35083: dumping result to json 33192 1726883089.35100: done dumping result, returning 33192 1726883089.35110: 
done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affe814-3a2d-6c15-6a7e-000000000158] 33192 1726883089.35115: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000158 ok: [managed_node1] 33192 1726883089.35521: no more pending results, returning what we have 33192 1726883089.35525: results queue empty 33192 1726883089.35526: checking for any_errors_fatal 33192 1726883089.35528: done checking for any_errors_fatal 33192 1726883089.35529: checking for max_fail_percentage 33192 1726883089.35531: done checking for max_fail_percentage 33192 1726883089.35532: checking to see if all hosts have failed and the running result is not ok 33192 1726883089.35533: done checking to see if all hosts have failed 33192 1726883089.35739: getting the remaining hosts for this loop 33192 1726883089.35741: done getting the remaining hosts for this loop 33192 1726883089.35746: getting the next task for host managed_node1 33192 1726883089.35756: done getting next task for host managed_node1 33192 1726883089.35759: ^ task is: TASK: Check if system is ostree 33192 1726883089.35762: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883089.35766: getting variables 33192 1726883089.35767: in VariableManager get_vars() 33192 1726883089.35883: Calling all_inventory to load vars for managed_node1 33192 1726883089.35887: Calling groups_inventory to load vars for managed_node1 33192 1726883089.35891: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883089.35914: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000158 33192 1726883089.35918: WORKER PROCESS EXITING 33192 1726883089.35928: Calling all_plugins_play to load vars for managed_node1 33192 1726883089.35940: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883089.35946: Calling groups_plugins_play to load vars for managed_node1 33192 1726883089.36238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883089.36989: done with get_vars() 33192 1726883089.37001: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:44:49 -0400 (0:00:00.905) 0:00:02.772 ****** 33192 1726883089.37226: entering _queue_task() for managed_node1/stat 33192 1726883089.37879: worker is 1 (out of 1 available) 33192 1726883089.37896: exiting _queue_task() for managed_node1/stat 33192 1726883089.37919: done queuing things up, now waiting for results queue to drain 33192 1726883089.37921: waiting for pending results... 
33192 1726883089.38199: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 33192 1726883089.38328: in run() - task 0affe814-3a2d-6c15-6a7e-00000000015a 33192 1726883089.38355: variable 'ansible_search_path' from source: unknown 33192 1726883089.38363: variable 'ansible_search_path' from source: unknown 33192 1726883089.38409: calling self._execute() 33192 1726883089.38509: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883089.38526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883089.38547: variable 'omit' from source: magic vars 33192 1726883089.39141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33192 1726883089.39486: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33192 1726883089.39642: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33192 1726883089.39645: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33192 1726883089.39648: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33192 1726883089.39742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33192 1726883089.39784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33192 1726883089.39821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33192 1726883089.39867: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33192 1726883089.40117: Evaluated conditional (not __network_is_ostree is defined): True 33192 1726883089.40441: variable 'omit' from source: magic vars 33192 1726883089.40445: variable 'omit' from source: magic vars 33192 1726883089.40447: variable 'omit' from source: magic vars 33192 1726883089.40482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33192 1726883089.40520: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33192 1726883089.40769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33192 1726883089.40772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33192 1726883089.40775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33192 1726883089.40777: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33192 1726883089.40779: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883089.40782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883089.40955: Set connection var ansible_shell_type to sh 33192 1726883089.41106: Set connection var ansible_connection to ssh 33192 1726883089.41125: Set connection var ansible_timeout to 10 33192 1726883089.41138: Set connection var ansible_module_compression to ZIP_DEFLATED 33192 1726883089.41149: Set connection var ansible_pipelining to False 33192 1726883089.41159: Set connection var ansible_shell_executable to /bin/sh 33192 1726883089.41189: variable 'ansible_shell_executable' from source: unknown 33192 1726883089.41244: variable 'ansible_connection' from 
source: unknown 33192 1726883089.41253: variable 'ansible_module_compression' from source: unknown 33192 1726883089.41260: variable 'ansible_shell_type' from source: unknown 33192 1726883089.41266: variable 'ansible_shell_executable' from source: unknown 33192 1726883089.41273: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883089.41318: variable 'ansible_pipelining' from source: unknown 33192 1726883089.41326: variable 'ansible_timeout' from source: unknown 33192 1726883089.41336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883089.41741: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 33192 1726883089.41745: variable 'omit' from source: magic vars 33192 1726883089.41747: starting attempt loop 33192 1726883089.41749: running the handler 33192 1726883089.41767: _low_level_execute_command(): starting 33192 1726883089.41780: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33192 1726883089.43240: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883089.43256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33192 1726883089.43276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 33192 1726883089.43388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883089.43511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33192 1726883089.43524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883089.43562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883089.43626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883089.45377: stdout chunk (state=3): >>>/root <<< 33192 1726883089.45480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883089.45833: stderr chunk (state=3): >>><<< 33192 1726883089.45839: stdout chunk (state=3): >>><<< 33192 1726883089.45843: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33192 1726883089.45852: _low_level_execute_command(): starting 33192 1726883089.45855: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233 `" && echo ansible-tmp-1726883089.4572847-33297-24988556496233="` echo /root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233 `" ) && sleep 0' 33192 1726883089.47447: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 33192 1726883089.47453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found <<< 33192 1726883089.47456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 33192 1726883089.47459: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 <<< 33192 1726883089.47462: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883089.47836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK <<< 33192 1726883089.47966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883089.48032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33192 1726883089.50086: stdout chunk (state=3): >>>ansible-tmp-1726883089.4572847-33297-24988556496233=/root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233 <<< 33192 1726883089.50305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883089.50403: stderr chunk (state=3): >>><<< 33192 1726883089.50407: stdout chunk (state=3): >>><<< 33192 1726883089.50409: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883089.4572847-33297-24988556496233=/root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33192 1726883089.50641: 
variable 'ansible_module_compression' from source: unknown 33192 1726883089.50939: ANSIBALLZ: Using lock for stat 33192 1726883089.50943: ANSIBALLZ: Acquiring lock 33192 1726883089.50945: ANSIBALLZ: Lock acquired: 140092633062560 33192 1726883089.50947: ANSIBALLZ: Creating module 33192 1726883089.79571: ANSIBALLZ: Writing module into payload 33192 1726883089.79658: ANSIBALLZ: Writing module 33192 1726883089.79676: ANSIBALLZ: Renaming module 33192 1726883089.79684: ANSIBALLZ: Done creating module 33192 1726883089.79701: variable 'ansible_facts' from source: unknown 33192 1726883089.79750: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233/AnsiballZ_stat.py 33192 1726883089.79866: Sending initial data 33192 1726883089.79870: Sent initial data (152 bytes) 33192 1726883089.80327: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883089.80331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883089.80342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33192 1726883089.80348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883089.80398: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 33192 1726883089.80403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883089.80404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883089.80468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33192 1726883089.82535: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 33192 1726883089.82595: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 33192 1726883089.82658: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33192zxvjc6ee/tmpg998jd92 /root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233/AnsiballZ_stat.py <<< 33192 1726883089.82661: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233/AnsiballZ_stat.py" <<< 33192 1726883089.82722: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-33192zxvjc6ee/tmpg998jd92" to remote "/root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233/AnsiballZ_stat.py" <<< 33192 1726883089.82726: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233/AnsiballZ_stat.py" <<< 33192 1726883089.83627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883089.83739: stderr chunk (state=3): >>><<< 33192 1726883089.83741: stdout chunk (state=3): >>><<< 33192 1726883089.83744: done transferring module to remote 33192 1726883089.83748: _low_level_execute_command(): starting 33192 1726883089.83750: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233/ /root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233/AnsiballZ_stat.py && sleep 0' 33192 1726883089.84346: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 33192 1726883089.84364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33192 1726883089.84384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883089.84403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33192 1726883089.84422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.10.217 originally 10.31.10.217 <<< 33192 1726883089.84439: stderr chunk (state=3): >>>debug2: match not found <<< 33192 1726883089.84504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883089.84551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33192 1726883089.84586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883089.84658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33192 1726883089.87471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33192 1726883089.87485: stdout chunk (state=3): >>><<< 33192 1726883089.87502: stderr chunk (state=3): >>><<< 33192 1726883089.87526: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 33192 1726883089.87537: _low_level_execute_command(): starting 33192 1726883089.87548: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233/AnsiballZ_stat.py && sleep 0' 33192 1726883089.88143: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 33192 1726883089.88168: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33192 1726883089.88188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883089.88304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 33192 1726883089.88328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883089.88440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33192 1726883089.91620: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 33192 1726883089.91690: stdout chunk (state=3): >>>import _imp # builtin <<< 33192 1726883089.91738: stdout chunk (state=3): >>>import '_thread' # <<< 33192 1726883089.91765: stdout chunk (state=3): >>>import '_warnings' # <<< 33192 1726883089.91786: stdout chunk (state=3): >>>import '_weakref' # <<< 33192 1726883089.91893: stdout chunk (state=3): >>>import '_io' # <<< 33192 1726883089.91900: stdout chunk (state=3): >>> <<< 33192 1726883089.91915: stdout chunk (state=3): >>>import 'marshal' # <<< 33192 1726883089.91985: stdout chunk (state=3): >>> import 'posix' # <<< 33192 1726883089.91997: stdout chunk (state=3): >>> <<< 33192 1726883089.92043: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 33192 1726883089.92053: stdout chunk (state=3): >>> <<< 33192 1726883089.92058: stdout chunk (state=3): >>># installing zipimport hook<<< 33192 1726883089.92089: stdout chunk (state=3): >>> <<< 33192 1726883089.92096: stdout chunk (state=3): >>>import 'time' # <<< 33192 1726883089.92121: stdout chunk (state=3): >>> import 'zipimport' # <<< 33192 1726883089.92207: stdout chunk (state=3): >>> # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py<<< 33192 1726883089.92215: stdout chunk (state=3): >>> <<< 33192 1726883089.92233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc'<<< 33192 1726883089.92240: stdout chunk (state=3): >>> <<< 33192 1726883089.92268: stdout chunk (state=3): >>>import '_codecs' # <<< 33192 1726883089.92276: stdout chunk (state=3): >>> 
<<< 33192 1726883089.92318: stdout chunk (state=3): >>>import 'codecs' # <<< 33192 1726883089.92330: stdout chunk (state=3): >>> <<< 33192 1726883089.92415: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc'<<< 33192 1726883089.92419: stdout chunk (state=3): >>> <<< 33192 1726883089.92439: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c40c530><<< 33192 1726883089.92462: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c3dbb30><<< 33192 1726883089.92496: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py<<< 33192 1726883089.92520: stdout chunk (state=3): >>> <<< 33192 1726883089.92523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 33192 1726883089.92526: stdout chunk (state=3): >>> <<< 33192 1726883089.92571: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c40eab0> import '_signal' # <<< 33192 1726883089.92578: stdout chunk (state=3): >>> <<< 33192 1726883089.92606: stdout chunk (state=3): >>>import '_abc' # <<< 33192 1726883089.92634: stdout chunk (state=3): >>> import 'abc' # <<< 33192 1726883089.92666: stdout chunk (state=3): >>> import 'io' # <<< 33192 1726883089.92672: stdout chunk (state=3): >>> <<< 33192 1726883089.92711: stdout chunk (state=3): >>>import '_stat' # <<< 33192 1726883089.92737: stdout chunk (state=3): >>> import 'stat' # <<< 33192 1726883089.92883: stdout chunk (state=3): >>> import '_collections_abc' # <<< 33192 1726883089.92889: stdout chunk (state=3): >>> <<< 33192 1726883089.92928: stdout chunk 
(state=3): >>>import 'genericpath' # <<< 33192 1726883089.92953: stdout chunk (state=3): >>> <<< 33192 1726883089.92957: stdout chunk (state=3): >>>import 'posixpath' # <<< 33192 1726883089.92959: stdout chunk (state=3): >>> <<< 33192 1726883089.93005: stdout chunk (state=3): >>>import 'os' # <<< 33192 1726883089.93010: stdout chunk (state=3): >>> <<< 33192 1726883089.93039: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 33192 1726883089.93045: stdout chunk (state=3): >>> <<< 33192 1726883089.93069: stdout chunk (state=3): >>>Processing user site-packages<<< 33192 1726883089.93098: stdout chunk (state=3): >>> <<< 33192 1726883089.93101: stdout chunk (state=3): >>>Processing global site-packages<<< 33192 1726883089.93115: stdout chunk (state=3): >>> <<< 33192 1726883089.93118: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages'<<< 33192 1726883089.93124: stdout chunk (state=3): >>> <<< 33192 1726883089.93150: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages'<<< 33192 1726883089.93152: stdout chunk (state=3): >>> <<< 33192 1726883089.93176: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 33192 1726883089.93186: stdout chunk (state=3): >>> <<< 33192 1726883089.93221: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 33192 1726883089.93262: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1bd160> <<< 33192 1726883089.93349: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 33192 1726883089.93375: stdout chunk (state=3): >>> # code 
object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 33192 1726883089.93387: stdout chunk (state=3): >>> <<< 33192 1726883089.93399: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1bdfd0> <<< 33192 1726883089.93491: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux<<< 33192 1726883089.93494: stdout chunk (state=3): >>> <<< 33192 1726883089.93503: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information.<<< 33192 1726883089.93636: stdout chunk (state=3): >>> <<< 33192 1726883089.93902: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 33192 1726883089.93943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 33192 1726883089.93980: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 33192 1726883089.94009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 33192 1726883089.94014: stdout chunk (state=3): >>> <<< 33192 1726883089.94052: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 33192 1726883089.94124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 33192 1726883089.94164: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 33192 1726883089.94210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 33192 1726883089.94244: stdout chunk (state=3): >>>import 'types' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f615c1fbe90><<< 33192 1726883089.94278: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py<<< 33192 1726883089.94288: stdout chunk (state=3): >>> <<< 33192 1726883089.94308: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 33192 1726883089.94352: stdout chunk (state=3): >>> import '_operator' # <<< 33192 1726883089.94378: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1fbf50><<< 33192 1726883089.94411: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py<<< 33192 1726883089.94417: stdout chunk (state=3): >>> <<< 33192 1726883089.94457: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc'<<< 33192 1726883089.94505: stdout chunk (state=3): >>> # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 33192 1726883089.94587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883089.94649: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 33192 1726883089.94672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c2338c0><<< 33192 1726883089.94704: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py<<< 33192 1726883089.94713: stdout chunk (state=3): >>> <<< 33192 1726883089.94737: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c233f50><<< 33192 1726883089.94769: stdout chunk (state=3): >>> import '_collections' # <<< 33192 1726883089.94853: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c213b60><<< 33192 1726883089.94875: stdout chunk (state=3): >>> import '_functools' # <<< 33192 1726883089.94928: stdout chunk (state=3): >>> import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c211280><<< 33192 1726883089.94934: stdout chunk (state=3): >>> <<< 33192 1726883089.95090: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1f9040> <<< 33192 1726883089.95141: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py<<< 33192 1726883089.95175: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc'<<< 33192 1726883089.95178: stdout chunk (state=3): >>> <<< 33192 1726883089.95202: stdout chunk (state=3): >>>import '_sre' # <<< 33192 1726883089.95208: stdout chunk (state=3): >>> <<< 33192 1726883089.95278: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 33192 1726883089.95282: stdout chunk (state=3): >>> <<< 33192 1726883089.95312: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py<<< 33192 1726883089.95319: stdout chunk (state=3): >>> <<< 33192 1726883089.95386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f615c257800><<< 33192 1726883089.95396: stdout chunk (state=3): >>> <<< 33192 1726883089.95419: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c256420><<< 33192 1726883089.95425: stdout chunk (state=3): >>> <<< 33192 1726883089.95457: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py<<< 33192 1726883089.95478: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c212150><<< 33192 1726883089.95490: stdout chunk (state=3): >>> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c254cb0><<< 33192 1726883089.95572: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 33192 1726883089.95598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 33192 1726883089.95617: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c288860> <<< 33192 1726883089.95640: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1f82c0> <<< 33192 1726883089.95678: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py<<< 33192 1726883089.95686: stdout chunk (state=3): >>> <<< 33192 1726883089.95729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883089.95757: stdout chunk (state=3): >>># extension module '_struct' 
executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883089.95820: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c288d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c288bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883089.95847: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883089.95867: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c288f80><<< 33192 1726883089.95881: stdout chunk (state=3): >>> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1f6de0><<< 33192 1726883089.95927: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 33192 1726883089.95945: stdout chunk (state=3): >>> <<< 33192 1726883089.95952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883089.96026: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc'<<< 33192 1726883089.96030: stdout chunk (state=3): >>> <<< 33192 1726883089.96058: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c289640> <<< 33192 1726883089.96085: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c289310> import 'importlib.machinery' # <<< 33192 1726883089.96130: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py<<< 33192 1726883089.96147: stdout chunk (state=3): >>> <<< 33192 1726883089.96150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 33192 1726883089.96179: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c28a4e0> <<< 33192 1726883089.96220: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 33192 1726883089.96261: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py<<< 33192 1726883089.96267: stdout chunk (state=3): >>> <<< 33192 1726883089.96315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc'<<< 33192 1726883089.96358: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 33192 1726883089.96378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc'<<< 33192 1726883089.96411: stdout chunk (state=3): >>> import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c2a0710> import 'errno' # <<< 33192 1726883089.96417: stdout chunk (state=3): >>> <<< 33192 1726883089.96449: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883089.96474: stdout chunk (state=3): >>> # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883089.96510: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c2a1df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 33192 
1726883089.96516: stdout chunk (state=3): >>> <<< 33192 1726883089.96538: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 33192 1726883089.96574: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 33192 1726883089.96614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c2a2cc0> <<< 33192 1726883089.96668: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883089.96694: stdout chunk (state=3): >>> # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c2a3320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c2a2240><<< 33192 1726883089.96728: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 33192 1726883089.96760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 33192 1726883089.96809: stdout chunk (state=3): >>> # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883089.96817: stdout chunk (state=3): >>> <<< 33192 1726883089.96850: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883089.96853: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c2a3da0> <<< 33192 1726883089.96942: stdout chunk (state=3): >>>import 'lzma' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f615c2a34d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c28a540><<< 33192 1726883089.96945: stdout chunk (state=3): >>> <<< 33192 1726883089.96982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py<<< 33192 1726883089.96985: stdout chunk (state=3): >>> <<< 33192 1726883089.97032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 33192 1726883089.97037: stdout chunk (state=3): >>> <<< 33192 1726883089.97061: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py<<< 33192 1726883089.97105: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 33192 1726883089.97110: stdout chunk (state=3): >>> <<< 33192 1726883089.97159: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883089.97164: stdout chunk (state=3): >>> <<< 33192 1726883089.97205: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c03fc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py<<< 33192 1726883089.97220: stdout chunk (state=3): >>> <<< 33192 1726883089.97228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 33192 1726883089.97283: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c0687a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c068500><<< 33192 1726883089.97317: stdout chunk (state=3): >>> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883089.97325: stdout chunk (state=3): >>> <<< 33192 1726883089.97337: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c068710><<< 33192 1726883089.97376: stdout chunk (state=3): >>> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883089.97395: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883089.97430: stdout chunk (state=3): >>> import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c068980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c03de20><<< 33192 1726883089.97441: stdout chunk (state=3): >>> <<< 33192 1726883089.97464: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 33192 1726883089.97632: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 33192 1726883089.97674: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 33192 1726883089.97701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 33192 1726883089.97727: 
stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c06a090><<< 33192 1726883089.97733: stdout chunk (state=3): >>> <<< 33192 1726883089.97777: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c068d10> <<< 33192 1726883089.97814: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c28ac30> <<< 33192 1726883089.97864: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 33192 1726883089.97950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883089.97989: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py<<< 33192 1726883089.98138: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c096420> <<< 33192 1726883089.98164: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 33192 1726883089.98201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc'<<< 33192 1726883089.98236: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py<<< 33192 1726883089.98243: stdout chunk (state=3): >>> <<< 33192 1726883089.98273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 33192 1726883089.98360: stdout chunk (state=3): >>> import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f615c0b2510> <<< 33192 1726883089.98400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 33192 1726883089.98473: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 33192 1726883089.98576: stdout chunk (state=3): >>>import 'ntpath' # <<< 33192 1726883089.98615: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py<<< 33192 1726883089.98659: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c0eb290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py<<< 33192 1726883089.98664: stdout chunk (state=3): >>> <<< 33192 1726883089.98718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 33192 1726883089.98762: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 33192 1726883089.98837: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 33192 1726883089.98933: stdout chunk (state=3): >>> <<< 33192 1726883089.98996: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c115a30> <<< 33192 1726883089.99121: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c0eb3b0><<< 33192 1726883089.99127: stdout chunk (state=3): >>> <<< 33192 1726883089.99196: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c0b31a0> <<< 33192 1726883089.99241: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 33192 1726883089.99261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 33192 1726883089.99285: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf2c380> <<< 33192 1726883089.99317: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c0b1550><<< 33192 1726883089.99480: stdout chunk (state=3): >>> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c06afc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc'<<< 33192 1726883089.99488: stdout chunk (state=3): >>> <<< 33192 1726883089.99524: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f615bf2c560> <<< 33192 1726883089.99658: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_20bam6me/ansible_stat_payload.zip' <<< 33192 1726883089.99685: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883089.99956: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.00005: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 33192 1726883090.00041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 33192 1726883090.00105: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 33192 1726883090.00237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 33192 1726883090.00282: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py<<< 33192 1726883090.00305: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf81f40> import '_typing' # <<< 33192 1726883090.00436: stdout chunk (state=3): >>> <<< 33192 1726883090.00631: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf58e60><<< 33192 1726883090.00641: stdout chunk (state=3): >>> <<< 33192 1726883090.00654: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf2ffb0><<< 33192 1726883090.00673: stdout chunk (state=3): >>> <<< 33192 1726883090.00676: stdout chunk (state=3): >>># zipimport: zlib available<<< 33192 1726883090.00695: stdout chunk (state=3): >>> <<< 33192 1726883090.00724: stdout chunk (state=3): >>>import 'ansible' # <<< 33192 1726883090.00730: stdout chunk (state=3): >>> <<< 33192 1726883090.00756: stdout chunk (state=3): >>># zipimport: zlib available<<< 33192 1726883090.00762: stdout chunk (state=3): >>> <<< 33192 1726883090.00795: stdout chunk (state=3): >>># zipimport: zlib available<<< 33192 1726883090.00806: stdout chunk (state=3): >>> <<< 33192 1726883090.00822: stdout chunk (state=3): >>># zipimport: zlib available<<< 33192 1726883090.00830: stdout chunk (state=3): >>> <<< 33192 1726883090.00856: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 33192 1726883090.00862: stdout chunk (state=3): >>> <<< 33192 1726883090.00885: stdout chunk (state=3): >>># zipimport: zlib available<<< 33192 1726883090.00942: stdout chunk (state=3): >>> <<< 33192 1726883090.03393: stdout chunk (state=3): >>># zipimport: zlib available<<< 33192 1726883090.03395: stdout chunk (state=3): >>> <<< 33192 1726883090.05550: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py<<< 33192 1726883090.05604: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf5bdd0><<< 33192 1726883090.05607: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 33192 1726883090.05664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py<<< 33192 1726883090.05688: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 33192 1726883090.05720: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 33192 1726883090.05828: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bfa9940> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bfa96d0><<< 33192 1726883090.05850: stdout chunk (state=3): >>> <<< 33192 1726883090.05883: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bfa8fe0> <<< 33192 1726883090.05924: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py<<< 33192 
1726883090.05944: stdout chunk (state=3): >>> <<< 33192 1726883090.06019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bfa9730> <<< 33192 1726883090.06042: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf82bd0> <<< 33192 1726883090.06083: stdout chunk (state=3): >>>import 'atexit' # <<< 33192 1726883090.06119: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.06122: stdout chunk (state=3): >>> # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.06187: stdout chunk (state=3): >>> import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bfaa690> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.06191: stdout chunk (state=3): >>> import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bfaa8d0><<< 33192 1726883090.06224: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py<<< 33192 1726883090.06248: stdout chunk (state=3): >>> <<< 33192 1726883090.06304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 33192 1726883090.06340: stdout chunk (state=3): >>> import '_locale' # <<< 33192 1726883090.06424: stdout chunk (state=3): >>> import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bfaae10><<< 33192 1726883090.06428: stdout chunk (state=3): >>> <<< 33192 1726883090.06477: stdout chunk 
(state=3): >>>import 'pwd' # <<< 33192 1726883090.06498: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 33192 1726883090.06532: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 33192 1726883090.06551: stdout chunk (state=3): >>> <<< 33192 1726883090.06593: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be0cbc0><<< 33192 1726883090.06656: stdout chunk (state=3): >>> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883090.06661: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.06688: stdout chunk (state=3): >>> <<< 33192 1726883090.06719: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be0e7b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 33192 1726883090.06743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 33192 1726883090.06794: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be0f170><<< 33192 1726883090.06837: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 33192 1726883090.06907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be10350> <<< 33192 1726883090.07030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py<<< 33192 
1726883090.07076: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 33192 1726883090.07210: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be12de0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.07225: stdout chunk (state=3): >>> <<< 33192 1726883090.07253: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be13110><<< 33192 1726883090.07289: stdout chunk (state=3): >>> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be110a0><<< 33192 1726883090.07304: stdout chunk (state=3): >>> <<< 33192 1726883090.07323: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py<<< 33192 1726883090.07379: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 33192 1726883090.07412: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 33192 1726883090.07450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc'<<< 33192 1726883090.07476: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 33192 1726883090.07523: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 33192 1726883090.07571: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 33192 1726883090.07598: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be16e10><<< 33192 1726883090.07636: stdout chunk (state=3): >>> import '_tokenize' # <<< 33192 1726883090.07738: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be15910><<< 33192 1726883090.07770: stdout chunk (state=3): >>> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be15670> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 33192 1726883090.07921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be17dd0> <<< 33192 1726883090.07973: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be115b0> <<< 33192 1726883090.08028: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883090.08069: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883090.08087: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be5eff0> <<< 33192 1726883090.08140: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 33192 1726883090.08144: stdout chunk (state=3): >>> <<< 33192 1726883090.08191: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be5f140> <<< 33192 1726883090.08206: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 33192 1726883090.08226: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 33192 1726883090.08263: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 33192 1726883090.08337: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.08341: stdout chunk (state=3): >>> <<< 33192 1726883090.08369: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be60d40> <<< 33192 1726883090.08408: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be60ad0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 33192 1726883090.08694: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be63230><<< 33192 1726883090.08720: stdout chunk (state=3): >>> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be61370> <<< 33192 1726883090.08840: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 33192 1726883090.08857: stdout chunk (state=3): >>> <<< 33192 1726883090.08909: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 33192 1726883090.08940: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 33192 1726883090.08954: stdout chunk (state=3): >>>import '_string' # <<< 33192 1726883090.09023: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be6a9f0><<< 33192 1726883090.09148: stdout chunk (state=3): >>> <<< 33192 1726883090.09313: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be63380><<< 33192 1726883090.09318: stdout chunk (state=3): >>> <<< 33192 1726883090.09439: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.09445: stdout chunk (state=3): >>> # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.09471: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be6b530> <<< 33192 1726883090.09504: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.09552: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be6bbc0><<< 33192 1726883090.09564: stdout chunk (state=3): >>> <<< 33192 1726883090.09632: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.09650: stdout chunk (state=3): >>> # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883090.09695: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be6bb00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be5f440><<< 33192 1726883090.09739: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 33192 1726883090.09755: stdout chunk (state=3): >>> <<< 33192 1726883090.09828: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 33192 1726883090.09848: stdout chunk (state=3): >>> <<< 33192 1726883090.09869: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883090.09912: stdout chunk (state=3): >>># extension module '_socket' executed from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.10039: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be6f440> <<< 33192 1726883090.10261: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.10273: stdout chunk (state=3): >>> <<< 33192 1726883090.10299: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.10303: stdout chunk (state=3): >>> import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be70500><<< 33192 1726883090.10320: stdout chunk (state=3): >>> <<< 33192 1726883090.10340: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be6dbb0><<< 33192 1726883090.10348: stdout chunk (state=3): >>> <<< 33192 1726883090.10396: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883090.10411: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 33192 1726883090.10422: stdout chunk (state=3): >>> <<< 33192 1726883090.10432: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be6ef30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be6d790> <<< 33192 1726883090.10462: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.10503: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 33192 1726883090.10538: stdout chunk (state=3): >>> # zipimport: zlib available<<< 
33192 1726883090.10545: stdout chunk (state=3): >>> <<< 33192 1726883090.10833: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.10884: stdout chunk (state=3): >>># zipimport: zlib available<<< 33192 1726883090.10890: stdout chunk (state=3): >>> <<< 33192 1726883090.10970: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.10995: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 33192 1726883090.11002: stdout chunk (state=3): >>> <<< 33192 1726883090.11037: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.11066: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.11089: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 33192 1726883090.11122: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.11377: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.11833: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.12845: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.13887: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 33192 1726883090.13900: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 33192 1726883090.13942: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 33192 1726883090.13968: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883090.14011: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bef4650> <<< 33192 1726883090.14118: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 33192 1726883090.14158: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bef5430> <<< 33192 1726883090.14181: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be72ed0> <<< 33192 1726883090.14224: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 33192 1726883090.14227: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.14281: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 33192 1726883090.14307: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.14458: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.14760: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 33192 1726883090.14779: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bef52b0> <<< 33192 1726883090.14797: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.15744: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.16720: stdout chunk (state=3): >>># zipimport: zlib available<<< 33192 1726883090.16726: stdout chunk (state=3): >>> <<< 33192 1726883090.16874: stdout chunk (state=3): >>># zipimport: zlib available<<< 33192 1726883090.16882: stdout chunk (state=3): 
>>> <<< 33192 1726883090.17016: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 33192 1726883090.17047: stdout chunk (state=3): >>> # zipimport: zlib available <<< 33192 1726883090.17117: stdout chunk (state=3): >>># zipimport: zlib available<<< 33192 1726883090.17194: stdout chunk (state=3): >>> <<< 33192 1726883090.17197: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 33192 1726883090.17238: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.17325: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.17451: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 33192 1726883090.17487: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 33192 1726883090.17491: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.17751: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.17754: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 33192 1726883090.17775: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.17891: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.18166: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 33192 1726883090.18232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 33192 1726883090.18263: stdout chunk (state=3): >>>import '_ast' # <<< 33192 1726883090.18340: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bef7e60> <<< 33192 1726883090.18361: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.18470: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.18547: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 33192 1726883090.18585: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 33192 1726883090.18589: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 33192 1726883090.18663: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883090.18788: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bf020f0> <<< 33192 1726883090.18858: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 33192 1726883090.18862: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bf029f0> <<< 33192 1726883090.18894: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bef6c60> # zipimport: zlib available <<< 33192 1726883090.18915: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.18966: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 33192 1726883090.19024: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.19068: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.19132: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 33192 1726883090.19205: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 33192 1726883090.19258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883090.19352: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bf01700> <<< 33192 1726883090.19387: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf02bd0> <<< 33192 1726883090.19433: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 33192 1726883090.19507: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.19590: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.19604: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.19679: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 33192 1726883090.19685: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 33192 1726883090.19726: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 33192 1726883090.19729: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 33192 1726883090.19792: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 33192 1726883090.19814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 33192 1726883090.19834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 33192 1726883090.19884: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bd92cf0> <<< 33192 1726883090.19929: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bd0fb60> <<< 33192 1726883090.20050: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf06ab0> <<< 33192 1726883090.20055: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf06900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 33192 1726883090.20107: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 33192 1726883090.20173: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 33192 1726883090.20205: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 33192 1726883090.20209: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.20652: stdout chunk (state=3): >>># zipimport: zlib available <<< 33192 1726883090.20736: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 33192 1726883090.20905: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 33192 1726883090.20922: stdout chunk (state=3): >>># destroy __main__ <<< 33192 1726883090.21452: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 33192 1726883090.21480: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins <<< 33192 1726883090.21521: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # 
cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre <<< 33192 1726883090.21559: stdout chunk (state=3): >>># cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib <<< 33192 1726883090.21625: stdout chunk (state=3): 
>>># cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal <<< 33192 1726883090.21642: stdout chunk (state=3): >>># cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing 
ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 33192 1726883090.22000: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 33192 1726883090.22006: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 33192 1726883090.22048: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 33192 1726883090.22065: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch <<< 33192 1726883090.22072: stdout chunk (state=3): >>># destroy ipaddress <<< 33192 1726883090.22125: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport <<< 33192 1726883090.22137: stdout chunk (state=3): >>># destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib <<< 33192 1726883090.22150: stdout chunk (state=3): >>># destroy 
json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 33192 1726883090.22177: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select <<< 33192 1726883090.22183: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 33192 1726883090.22253: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib <<< 33192 1726883090.22256: stdout chunk (state=3): >>># destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 33192 1726883090.22315: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 33192 1726883090.22349: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader <<< 33192 1726883090.22356: stdout chunk (state=3): >>># cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 33192 1726883090.22396: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 33192 
1726883090.22401: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc<<< 33192 1726883090.22426: stdout chunk (state=3): >>> # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 33192 1726883090.22430: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 33192 1726883090.22449: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 33192 1726883090.22455: stdout chunk (state=3): >>># cleanup[3] wiping marshal <<< 33192 1726883090.22482: stdout chunk (state=3): >>># cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 33192 1726883090.22498: stdout chunk (state=3): >>> # cleanup[3] wiping builtins<<< 33192 1726883090.22511: stdout chunk (state=3): >>> # destroy selinux._selinux # destroy systemd._daemon<<< 33192 1726883090.22550: stdout chunk (state=3): >>> # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 33192 1726883090.22730: stdout 
chunk (state=3): >>># destroy sys.monitoring # destroy _socket<<< 33192 1726883090.22762: stdout chunk (state=3): >>> # destroy _collections<<< 33192 1726883090.22766: stdout chunk (state=3): >>> <<< 33192 1726883090.22796: stdout chunk (state=3): >>># destroy platform <<< 33192 1726883090.22827: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize<<< 33192 1726883090.22860: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib<<< 33192 1726883090.22866: stdout chunk (state=3): >>> <<< 33192 1726883090.22915: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib # destroy _typing<<< 33192 1726883090.22927: stdout chunk (state=3): >>> <<< 33192 1726883090.22948: stdout chunk (state=3): >>># destroy _tokenize <<< 33192 1726883090.22954: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response<<< 33192 1726883090.22982: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external<<< 33192 1726883090.23036: stdout chunk (state=3): >>> # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 33192 1726883090.23131: stdout chunk (state=3): >>># destroy codecs<<< 33192 1726883090.23140: stdout chunk (state=3): >>> <<< 33192 1726883090.23164: stdout chunk (state=3): >>># destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs<<< 33192 1726883090.23186: stdout chunk (state=3): >>> # destroy io # destroy traceback # destroy warnings # destroy weakref<<< 33192 1726883090.23204: stdout chunk (state=3): >>> # destroy 
collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 33192 1726883090.23267: stdout chunk (state=3): >>># destroy time # destroy _random <<< 33192 1726883090.23276: stdout chunk (state=3): >>># destroy _weakref<<< 33192 1726883090.23314: stdout chunk (state=3): >>> # destroy _operator<<< 33192 1726883090.23331: stdout chunk (state=3): >>> <<< 33192 1726883090.23340: stdout chunk (state=3): >>># destroy _sha2 <<< 33192 1726883090.23384: stdout chunk (state=3): >>># destroy _string # destroy re # destroy itertools <<< 33192 1726883090.23397: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix<<< 33192 1726883090.23420: stdout chunk (state=3): >>> # destroy _functools # destroy builtins # destroy _thread<<< 33192 1726883090.23443: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 33192 1726883090.23472: stdout chunk (state=3): >>> <<< 33192 1726883090.23964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
<<< 33192 1726883090.23967: stdout chunk (state=3): >>><<< 33192 1726883090.23970: stderr chunk (state=3): >>><<< 33192 1726883090.24075: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c40c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c3dbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c40eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1bd160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1bdfd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1fbe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1fbf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c2338c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c233f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c213b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c211280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1f9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c257800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c256420> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c212150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c254cb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c288860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1f82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c288d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c288bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c288f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c1f6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c289640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c289310> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c28a4e0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c2a0710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c2a1df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c2a2cc0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c2a3320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c2a2240> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c2a3da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c2a34d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c28a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c03fc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c0687a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c068500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c068710> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615c068980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c03de20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c06a090> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c068d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c28ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c096420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c0b2510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c0eb290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c115a30> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c0eb3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c0b31a0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf2c380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c0b1550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615c06afc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f615bf2c560> # zipimport: found 30 names in '/tmp/ansible_stat_payload_20bam6me/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf81f40> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf58e60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf2ffb0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf5bdd0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bfa9940> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bfa96d0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bfa8fe0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bfa9730> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf82bd0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bfaa690> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bfaa8d0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bfaae10> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be0cbc0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be0e7b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be0f170> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be10350> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be12de0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be13110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be110a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be16e10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be15910> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be15670> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be17dd0> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f615be115b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be5eff0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be5f140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be60d40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be60ad0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be63230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be61370> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be6a9f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be63380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be6b530> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be6bbc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be6bb00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be5f440> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be6f440> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be70500> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be6dbb0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615be6ef30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be6d790> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bef4650> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bef5430> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615be72ed0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bef52b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bef7e60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bf020f0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bf029f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bef6c60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f615bf01700> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf02bd0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bd92cf0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bd0fb60> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f615bf06ab0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f615bf06900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # 
cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing 
contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # 
cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # 
destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # 
destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.10.217 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ [... interpreter shutdown trace identical to the cleanup output above ...] # cleanup[2] removing ctypes
# destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy 
linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # 
destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 33192 1726883090.24785: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': 
'/root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33192 1726883090.24789: _low_level_execute_command(): starting 33192 1726883090.24792: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883089.4572847-33297-24988556496233/ > /dev/null 2>&1 && sleep 0' 33192 1726883090.24881: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33192 1726883090.24892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883090.24896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration <<< 33192 1726883090.24908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33192 1726883090.24954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33192 1726883090.24957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33192 1726883090.25024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33192 1726883090.27691: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 33192 1726883090.27695: stdout chunk (state=3): >>><<< 33192 1726883090.27697: stderr chunk (state=3): >>><<< 33192 1726883090.27713: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.217 originally 10.31.10.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 33192 1726883090.27725: handler run complete 33192 1726883090.27761: attempt loop complete, returning result 33192 1726883090.27805: _execute() done 33192 1726883090.27809: dumping result to json 33192 1726883090.27811: done dumping result, returning 33192 1726883090.27814: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0affe814-3a2d-6c15-6a7e-00000000015a] 33192 1726883090.27822: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000015a 33192 1726883090.28201: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000015a 33192 1726883090.28205: WORKER PROCESS EXITING ok: [managed_node1] => { 
"changed": false, "stat": { "exists": false } } 33192 1726883090.28300: no more pending results, returning what we have 33192 1726883090.28304: results queue empty 33192 1726883090.28305: checking for any_errors_fatal 33192 1726883090.28327: done checking for any_errors_fatal 33192 1726883090.28328: checking for max_fail_percentage 33192 1726883090.28331: done checking for max_fail_percentage 33192 1726883090.28332: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.28333: done checking to see if all hosts have failed 33192 1726883090.28335: getting the remaining hosts for this loop 33192 1726883090.28338: done getting the remaining hosts for this loop 33192 1726883090.28344: getting the next task for host managed_node1 33192 1726883090.28352: done getting next task for host managed_node1 33192 1726883090.28355: ^ task is: TASK: Set flag to indicate system is ostree 33192 1726883090.28359: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.28363: getting variables 33192 1726883090.28365: in VariableManager get_vars() 33192 1726883090.28405: Calling all_inventory to load vars for managed_node1 33192 1726883090.28409: Calling groups_inventory to load vars for managed_node1 33192 1726883090.28414: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.28549: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.28555: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.28562: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.28990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.29343: done with get_vars() 33192 1726883090.29355: done getting variables 33192 1726883090.29483: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:44:50 -0400 (0:00:00.922) 0:00:03.695 ****** 33192 1726883090.29515: entering _queue_task() for managed_node1/set_fact 33192 1726883090.29517: Creating lock for set_fact 33192 1726883090.29897: worker is 1 (out of 1 available) 33192 1726883090.29909: exiting _queue_task() for managed_node1/set_fact 33192 1726883090.29923: done queuing things up, now waiting for results queue to drain 33192 1726883090.29924: waiting for pending results... 
33192 1726883090.30253: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 33192 1726883090.30330: in run() - task 0affe814-3a2d-6c15-6a7e-00000000015b 33192 1726883090.30352: variable 'ansible_search_path' from source: unknown 33192 1726883090.30362: variable 'ansible_search_path' from source: unknown 33192 1726883090.30482: calling self._execute() 33192 1726883090.30532: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.30550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.30567: variable 'omit' from source: magic vars 33192 1726883090.31253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33192 1726883090.31617: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33192 1726883090.31687: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33192 1726883090.31753: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33192 1726883090.31830: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33192 1726883090.31939: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33192 1726883090.32033: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33192 1726883090.32042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33192 1726883090.32076: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33192 1726883090.32229: Evaluated conditional (not __network_is_ostree is defined): True 33192 1726883090.32244: variable 'omit' from source: magic vars 33192 1726883090.32315: variable 'omit' from source: magic vars 33192 1726883090.32540: variable '__ostree_booted_stat' from source: set_fact 33192 1726883090.32559: variable 'omit' from source: magic vars 33192 1726883090.32610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33192 1726883090.32648: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33192 1726883090.32676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33192 1726883090.32718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33192 1726883090.32735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33192 1726883090.32797: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33192 1726883090.32806: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.32811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.32950: Set connection var ansible_shell_type to sh 33192 1726883090.33015: Set connection var ansible_connection to ssh 33192 1726883090.33018: Set connection var ansible_timeout to 10 33192 1726883090.33027: Set connection var ansible_module_compression to ZIP_DEFLATED 33192 1726883090.33029: Set connection var ansible_pipelining to False 33192 1726883090.33039: Set connection var ansible_shell_executable to /bin/sh 33192 1726883090.33064: variable 'ansible_shell_executable' 
from source: unknown 33192 1726883090.33075: variable 'ansible_connection' from source: unknown 33192 1726883090.33083: variable 'ansible_module_compression' from source: unknown 33192 1726883090.33090: variable 'ansible_shell_type' from source: unknown 33192 1726883090.33097: variable 'ansible_shell_executable' from source: unknown 33192 1726883090.33104: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.33112: variable 'ansible_pipelining' from source: unknown 33192 1726883090.33134: variable 'ansible_timeout' from source: unknown 33192 1726883090.33136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.33343: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 33192 1726883090.33352: variable 'omit' from source: magic vars 33192 1726883090.33366: starting attempt loop 33192 1726883090.33379: running the handler 33192 1726883090.33439: handler run complete 33192 1726883090.33442: attempt loop complete, returning result 33192 1726883090.33444: _execute() done 33192 1726883090.33448: dumping result to json 33192 1726883090.33451: done dumping result, returning 33192 1726883090.33453: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0affe814-3a2d-6c15-6a7e-00000000015b] 33192 1726883090.33455: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000015b 33192 1726883090.33646: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000015b 33192 1726883090.33650: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 33192 1726883090.33783: no more pending results, returning what we have 33192 1726883090.33787: results 
queue empty 33192 1726883090.33789: checking for any_errors_fatal 33192 1726883090.33796: done checking for any_errors_fatal 33192 1726883090.33797: checking for max_fail_percentage 33192 1726883090.33799: done checking for max_fail_percentage 33192 1726883090.33800: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.33801: done checking to see if all hosts have failed 33192 1726883090.33802: getting the remaining hosts for this loop 33192 1726883090.33804: done getting the remaining hosts for this loop 33192 1726883090.33809: getting the next task for host managed_node1 33192 1726883090.33820: done getting next task for host managed_node1 33192 1726883090.33823: ^ task is: TASK: Fix CentOS6 Base repo 33192 1726883090.33826: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.33831: getting variables 33192 1726883090.33833: in VariableManager get_vars() 33192 1726883090.34011: Calling all_inventory to load vars for managed_node1 33192 1726883090.34015: Calling groups_inventory to load vars for managed_node1 33192 1726883090.34019: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.34030: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.34035: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.34045: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.34330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.34695: done with get_vars() 33192 1726883090.34706: done getting variables 33192 1726883090.34861: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:44:50 -0400 (0:00:00.053) 0:00:03.749 ****** 33192 1726883090.34902: entering _queue_task() for managed_node1/copy 33192 1726883090.35186: worker is 1 (out of 1 available) 33192 1726883090.35199: exiting _queue_task() for managed_node1/copy 33192 1726883090.35211: done queuing things up, now waiting for results queue to drain 33192 1726883090.35212: waiting for pending results... 
33192 1726883090.35584: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 33192 1726883090.35625: in run() - task 0affe814-3a2d-6c15-6a7e-00000000015d 33192 1726883090.35682: variable 'ansible_search_path' from source: unknown 33192 1726883090.35685: variable 'ansible_search_path' from source: unknown 33192 1726883090.35711: calling self._execute() 33192 1726883090.35808: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.35891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.35902: variable 'omit' from source: magic vars 33192 1726883090.36545: variable 'ansible_distribution' from source: facts 33192 1726883090.36583: Evaluated conditional (ansible_distribution == 'CentOS'): False 33192 1726883090.36593: when evaluation is False, skipping this task 33192 1726883090.36602: _execute() done 33192 1726883090.36611: dumping result to json 33192 1726883090.36620: done dumping result, returning 33192 1726883090.36632: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0affe814-3a2d-6c15-6a7e-00000000015d] 33192 1726883090.36651: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000015d skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 33192 1726883090.36952: no more pending results, returning what we have 33192 1726883090.36957: results queue empty 33192 1726883090.36958: checking for any_errors_fatal 33192 1726883090.36963: done checking for any_errors_fatal 33192 1726883090.36964: checking for max_fail_percentage 33192 1726883090.36966: done checking for max_fail_percentage 33192 1726883090.36967: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.36968: done checking to see if all hosts have failed 33192 1726883090.36969: getting the remaining hosts for this loop 33192 1726883090.36974: done getting 
the remaining hosts for this loop 33192 1726883090.36978: getting the next task for host managed_node1 33192 1726883090.36989: done getting next task for host managed_node1 33192 1726883090.36992: ^ task is: TASK: Include the task 'enable_epel.yml' 33192 1726883090.36996: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33192 1726883090.37000: getting variables 33192 1726883090.37003: in VariableManager get_vars() 33192 1726883090.37035: Calling all_inventory to load vars for managed_node1 33192 1726883090.37157: Calling groups_inventory to load vars for managed_node1 33192 1726883090.37161: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.37175: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.37179: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.37183: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.37490: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000015d 33192 1726883090.37493: WORKER PROCESS EXITING 33192 1726883090.37519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.37877: done with get_vars() 33192 1726883090.37888: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:44:50 -0400 (0:00:00.030) 0:00:03.780 ****** 33192 1726883090.37999: entering _queue_task() for managed_node1/include_tasks 33192 1726883090.38449: worker is 1 (out of 1 available) 33192 1726883090.38460: exiting _queue_task() for managed_node1/include_tasks 33192 1726883090.38470: done queuing things up, now waiting for results queue to drain 33192 1726883090.38474: waiting for pending results... 33192 1726883090.38547: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 33192 1726883090.38675: in run() - task 0affe814-3a2d-6c15-6a7e-00000000015e 33192 1726883090.38709: variable 'ansible_search_path' from source: unknown 33192 1726883090.38720: variable 'ansible_search_path' from source: unknown 33192 1726883090.38764: calling self._execute() 33192 1726883090.38874: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.38891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.38921: variable 'omit' from source: magic vars 33192 1726883090.39591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33192 1726883090.42854: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33192 1726883090.42858: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33192 1726883090.42886: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33192 1726883090.42932: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33192 1726883090.42987: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33192 1726883090.43102: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33192 1726883090.43146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33192 1726883090.43199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33192 1726883090.43263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33192 1726883090.43312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33192 1726883090.43508: variable '__network_is_ostree' from source: set_fact 33192 1726883090.43516: Evaluated conditional (not __network_is_ostree | d(false)): True 33192 1726883090.43519: _execute() done 33192 1726883090.43521: dumping result to json 33192 1726883090.43527: done dumping result, returning 33192 1726883090.43532: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0affe814-3a2d-6c15-6a7e-00000000015e] 33192 1726883090.43536: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000015e 33192 1726883090.43844: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000015e 33192 1726883090.43847: WORKER PROCESS EXITING 33192 1726883090.43881: no more pending results, returning what we have 33192 1726883090.43886: in VariableManager get_vars() 33192 1726883090.43917: Calling all_inventory to load vars for managed_node1 33192 
1726883090.43920: Calling groups_inventory to load vars for managed_node1 33192 1726883090.43923: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.43936: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.43940: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.43944: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.44287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.44879: done with get_vars() 33192 1726883090.44888: variable 'ansible_search_path' from source: unknown 33192 1726883090.44890: variable 'ansible_search_path' from source: unknown 33192 1726883090.44945: we have included files to process 33192 1726883090.44947: generating all_blocks data 33192 1726883090.44948: done generating all_blocks data 33192 1726883090.44953: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 33192 1726883090.44954: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 33192 1726883090.44957: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 33192 1726883090.45940: done processing included file 33192 1726883090.45943: iterating over new_blocks loaded from include file 33192 1726883090.45944: in VariableManager get_vars() 33192 1726883090.45958: done with get_vars() 33192 1726883090.45959: filtering new block on tags 33192 1726883090.45989: done filtering new block on tags 33192 1726883090.45992: in VariableManager get_vars() 33192 1726883090.46009: done with get_vars() 33192 1726883090.46017: filtering new block on tags 33192 1726883090.46033: done filtering new block on tags 33192 1726883090.46037: done iterating over new_blocks loaded from include file included: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 33192 1726883090.46043: extending task lists for all hosts with included blocks 33192 1726883090.46191: done extending task lists 33192 1726883090.46193: done processing included files 33192 1726883090.46194: results queue empty 33192 1726883090.46195: checking for any_errors_fatal 33192 1726883090.46198: done checking for any_errors_fatal 33192 1726883090.46199: checking for max_fail_percentage 33192 1726883090.46200: done checking for max_fail_percentage 33192 1726883090.46201: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.46202: done checking to see if all hosts have failed 33192 1726883090.46203: getting the remaining hosts for this loop 33192 1726883090.46204: done getting the remaining hosts for this loop 33192 1726883090.46207: getting the next task for host managed_node1 33192 1726883090.46212: done getting next task for host managed_node1 33192 1726883090.46214: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 33192 1726883090.46222: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.46225: getting variables 33192 1726883090.46226: in VariableManager get_vars() 33192 1726883090.46243: Calling all_inventory to load vars for managed_node1 33192 1726883090.46246: Calling groups_inventory to load vars for managed_node1 33192 1726883090.46249: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.46254: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.46262: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.46266: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.46518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.46832: done with get_vars() 33192 1726883090.46844: done getting variables 33192 1726883090.46919: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 33192 1726883090.47138: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:44:50 -0400 (0:00:00.091) 0:00:03.872 ****** 33192 1726883090.47188: entering _queue_task() for managed_node1/command 33192 1726883090.47190: Creating lock for command 33192 1726883090.47465: worker is 1 (out of 1 available) 33192 1726883090.47480: exiting _queue_task() for managed_node1/command 33192 1726883090.47497: done queuing things up, now waiting for results queue to drain 33192 1726883090.47498: waiting for pending results... 
33192 1726883090.47706: running TaskExecutor() for managed_node1/TASK: Create EPEL 39 33192 1726883090.47797: in run() - task 0affe814-3a2d-6c15-6a7e-000000000178 33192 1726883090.47809: variable 'ansible_search_path' from source: unknown 33192 1726883090.47813: variable 'ansible_search_path' from source: unknown 33192 1726883090.47850: calling self._execute() 33192 1726883090.47912: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.47919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.47930: variable 'omit' from source: magic vars 33192 1726883090.48249: variable 'ansible_distribution' from source: facts 33192 1726883090.48257: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 33192 1726883090.48263: when evaluation is False, skipping this task 33192 1726883090.48266: _execute() done 33192 1726883090.48269: dumping result to json 33192 1726883090.48277: done dumping result, returning 33192 1726883090.48283: done running TaskExecutor() for managed_node1/TASK: Create EPEL 39 [0affe814-3a2d-6c15-6a7e-000000000178] 33192 1726883090.48288: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000178 33192 1726883090.48396: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000178 33192 1726883090.48401: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 33192 1726883090.48457: no more pending results, returning what we have 33192 1726883090.48460: results queue empty 33192 1726883090.48461: checking for any_errors_fatal 33192 1726883090.48463: done checking for any_errors_fatal 33192 1726883090.48464: checking for max_fail_percentage 33192 1726883090.48465: done checking for max_fail_percentage 33192 1726883090.48466: checking to see if all hosts have failed and the running result is not ok 33192 
1726883090.48467: done checking to see if all hosts have failed 33192 1726883090.48468: getting the remaining hosts for this loop 33192 1726883090.48469: done getting the remaining hosts for this loop 33192 1726883090.48475: getting the next task for host managed_node1 33192 1726883090.48481: done getting next task for host managed_node1 33192 1726883090.48484: ^ task is: TASK: Install yum-utils package 33192 1726883090.48488: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.48491: getting variables 33192 1726883090.48493: in VariableManager get_vars() 33192 1726883090.48517: Calling all_inventory to load vars for managed_node1 33192 1726883090.48520: Calling groups_inventory to load vars for managed_node1 33192 1726883090.48523: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.48530: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.48532: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.48537: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.48705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.48889: done with get_vars() 33192 1726883090.48897: done getting variables 33192 1726883090.48976: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:44:50 -0400 (0:00:00.018) 0:00:03.890 ****** 33192 1726883090.48998: entering _queue_task() for managed_node1/package 33192 1726883090.49000: Creating lock for package 33192 1726883090.49189: worker is 1 (out of 1 available) 33192 1726883090.49203: exiting _queue_task() for managed_node1/package 33192 1726883090.49217: done queuing things up, now waiting for results queue to drain 33192 1726883090.49218: waiting for pending results... 
33192 1726883090.49401: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 33192 1726883090.49504: in run() - task 0affe814-3a2d-6c15-6a7e-000000000179 33192 1726883090.49516: variable 'ansible_search_path' from source: unknown 33192 1726883090.49520: variable 'ansible_search_path' from source: unknown 33192 1726883090.49574: calling self._execute() 33192 1726883090.49644: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.49658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.49839: variable 'omit' from source: magic vars 33192 1726883090.50125: variable 'ansible_distribution' from source: facts 33192 1726883090.50144: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 33192 1726883090.50153: when evaluation is False, skipping this task 33192 1726883090.50160: _execute() done 33192 1726883090.50168: dumping result to json 33192 1726883090.50185: done dumping result, returning 33192 1726883090.50195: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0affe814-3a2d-6c15-6a7e-000000000179] 33192 1726883090.50205: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000179 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 33192 1726883090.50365: no more pending results, returning what we have 33192 1726883090.50369: results queue empty 33192 1726883090.50371: checking for any_errors_fatal 33192 1726883090.50378: done checking for any_errors_fatal 33192 1726883090.50379: checking for max_fail_percentage 33192 1726883090.50381: done checking for max_fail_percentage 33192 1726883090.50382: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.50383: done checking to see if all hosts have failed 33192 1726883090.50384: getting the remaining hosts for this loop 33192 
1726883090.50386: done getting the remaining hosts for this loop 33192 1726883090.50446: getting the next task for host managed_node1 33192 1726883090.50456: done getting next task for host managed_node1 33192 1726883090.50459: ^ task is: TASK: Enable EPEL 7 33192 1726883090.50464: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.50468: getting variables 33192 1726883090.50470: in VariableManager get_vars() 33192 1726883090.50519: Calling all_inventory to load vars for managed_node1 33192 1726883090.50522: Calling groups_inventory to load vars for managed_node1 33192 1726883090.50526: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.50538: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.50542: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.50547: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.50752: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000179 33192 1726883090.50755: WORKER PROCESS EXITING 33192 1726883090.50776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.51099: done with get_vars() 33192 1726883090.51111: done getting variables 33192 1726883090.51183: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:44:50 -0400 (0:00:00.022) 0:00:03.912 ****** 33192 1726883090.51213: entering _queue_task() for managed_node1/command 33192 1726883090.51447: worker is 1 (out of 1 available) 33192 1726883090.51460: exiting _queue_task() for managed_node1/command 33192 1726883090.51472: done queuing things up, now waiting for results queue to drain 33192 1726883090.51473: waiting for pending results... 
33192 1726883090.51740: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 33192 1726883090.51842: in run() - task 0affe814-3a2d-6c15-6a7e-00000000017a 33192 1726883090.51853: variable 'ansible_search_path' from source: unknown 33192 1726883090.51857: variable 'ansible_search_path' from source: unknown 33192 1726883090.51886: calling self._execute() 33192 1726883090.51950: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.51957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.51968: variable 'omit' from source: magic vars 33192 1726883090.52259: variable 'ansible_distribution' from source: facts 33192 1726883090.52269: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 33192 1726883090.52275: when evaluation is False, skipping this task 33192 1726883090.52278: _execute() done 33192 1726883090.52281: dumping result to json 33192 1726883090.52284: done dumping result, returning 33192 1726883090.52291: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0affe814-3a2d-6c15-6a7e-00000000017a] 33192 1726883090.52297: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000017a 33192 1726883090.52385: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000017a 33192 1726883090.52388: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 33192 1726883090.52440: no more pending results, returning what we have 33192 1726883090.52444: results queue empty 33192 1726883090.52445: checking for any_errors_fatal 33192 1726883090.52450: done checking for any_errors_fatal 33192 1726883090.52451: checking for max_fail_percentage 33192 1726883090.52453: done checking for max_fail_percentage 33192 1726883090.52454: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.52455: 
done checking to see if all hosts have failed 33192 1726883090.52455: getting the remaining hosts for this loop 33192 1726883090.52457: done getting the remaining hosts for this loop 33192 1726883090.52460: getting the next task for host managed_node1 33192 1726883090.52466: done getting next task for host managed_node1 33192 1726883090.52468: ^ task is: TASK: Enable EPEL 8 33192 1726883090.52475: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.52478: getting variables 33192 1726883090.52480: in VariableManager get_vars() 33192 1726883090.52504: Calling all_inventory to load vars for managed_node1 33192 1726883090.52507: Calling groups_inventory to load vars for managed_node1 33192 1726883090.52510: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.52517: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.52519: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.52522: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.52696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.52878: done with get_vars() 33192 1726883090.52886: done getting variables 33192 1726883090.52927: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:44:50 -0400 (0:00:00.017) 0:00:03.930 ****** 33192 1726883090.52953: entering _queue_task() for managed_node1/command 33192 1726883090.53126: worker is 1 (out of 1 available) 33192 1726883090.53140: exiting _queue_task() for managed_node1/command 33192 1726883090.53153: done queuing things up, now waiting for results queue to drain 33192 1726883090.53154: waiting for pending results... 
33192 1726883090.53300: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 33192 1726883090.53362: in run() - task 0affe814-3a2d-6c15-6a7e-00000000017b 33192 1726883090.53377: variable 'ansible_search_path' from source: unknown 33192 1726883090.53380: variable 'ansible_search_path' from source: unknown 33192 1726883090.53409: calling self._execute() 33192 1726883090.53464: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.53473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.53482: variable 'omit' from source: magic vars 33192 1726883090.53848: variable 'ansible_distribution' from source: facts 33192 1726883090.53853: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 33192 1726883090.53864: when evaluation is False, skipping this task 33192 1726883090.53867: _execute() done 33192 1726883090.53870: dumping result to json 33192 1726883090.53875: done dumping result, returning 33192 1726883090.53878: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0affe814-3a2d-6c15-6a7e-00000000017b] 33192 1726883090.53881: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000017b 33192 1726883090.54096: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000017b 33192 1726883090.54099: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 33192 1726883090.54196: no more pending results, returning what we have 33192 1726883090.54199: results queue empty 33192 1726883090.54200: checking for any_errors_fatal 33192 1726883090.54204: done checking for any_errors_fatal 33192 1726883090.54205: checking for max_fail_percentage 33192 1726883090.54207: done checking for max_fail_percentage 33192 1726883090.54208: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.54209: 
done checking to see if all hosts have failed 33192 1726883090.54210: getting the remaining hosts for this loop 33192 1726883090.54212: done getting the remaining hosts for this loop 33192 1726883090.54215: getting the next task for host managed_node1 33192 1726883090.54224: done getting next task for host managed_node1 33192 1726883090.54226: ^ task is: TASK: Enable EPEL 6 33192 1726883090.54230: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.54236: getting variables 33192 1726883090.54238: in VariableManager get_vars() 33192 1726883090.54265: Calling all_inventory to load vars for managed_node1 33192 1726883090.54268: Calling groups_inventory to load vars for managed_node1 33192 1726883090.54274: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.54283: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.54287: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.54291: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.54588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.54785: done with get_vars() 33192 1726883090.54794: done getting variables 33192 1726883090.54837: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:44:50 -0400 (0:00:00.019) 0:00:03.949 ****** 33192 1726883090.54858: entering _queue_task() for managed_node1/copy 33192 1726883090.55023: worker is 1 (out of 1 available) 33192 1726883090.55037: exiting _queue_task() for managed_node1/copy 33192 1726883090.55048: done queuing things up, now waiting for results queue to drain 33192 1726883090.55049: waiting for pending results... 
33192 1726883090.55203: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 33192 1726883090.55275: in run() - task 0affe814-3a2d-6c15-6a7e-00000000017d 33192 1726883090.55293: variable 'ansible_search_path' from source: unknown 33192 1726883090.55297: variable 'ansible_search_path' from source: unknown 33192 1726883090.55327: calling self._execute() 33192 1726883090.55397: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.55402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.55412: variable 'omit' from source: magic vars 33192 1726883090.55768: variable 'ansible_distribution' from source: facts 33192 1726883090.55782: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 33192 1726883090.55786: when evaluation is False, skipping this task 33192 1726883090.55789: _execute() done 33192 1726883090.55791: dumping result to json 33192 1726883090.55796: done dumping result, returning 33192 1726883090.55803: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0affe814-3a2d-6c15-6a7e-00000000017d] 33192 1726883090.55808: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000017d 33192 1726883090.55898: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000017d 33192 1726883090.55901: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 33192 1726883090.55973: no more pending results, returning what we have 33192 1726883090.55976: results queue empty 33192 1726883090.55977: checking for any_errors_fatal 33192 1726883090.55982: done checking for any_errors_fatal 33192 1726883090.55983: checking for max_fail_percentage 33192 1726883090.55984: done checking for max_fail_percentage 33192 1726883090.55985: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.55987: 
done checking to see if all hosts have failed 33192 1726883090.55987: getting the remaining hosts for this loop 33192 1726883090.55989: done getting the remaining hosts for this loop 33192 1726883090.55993: getting the next task for host managed_node1 33192 1726883090.56000: done getting next task for host managed_node1 33192 1726883090.56002: ^ task is: TASK: Set network provider to 'nm' 33192 1726883090.56004: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33192 1726883090.56007: getting variables 33192 1726883090.56008: in VariableManager get_vars() 33192 1726883090.56031: Calling all_inventory to load vars for managed_node1 33192 1726883090.56033: Calling groups_inventory to load vars for managed_node1 33192 1726883090.56037: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.56044: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.56046: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.56049: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.56218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.56394: done with get_vars() 33192 1726883090.56401: done getting variables 33192 1726883090.56448: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13 Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:03.965 ****** 33192 1726883090.56468: entering _queue_task() for managed_node1/set_fact 33192 1726883090.56628: worker is 1 (out of 1 available) 33192 1726883090.56642: exiting _queue_task() for managed_node1/set_fact 33192 1726883090.56654: done queuing things up, now waiting for results queue to drain 33192 1726883090.56655: waiting for pending results... 33192 1726883090.56798: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 33192 1726883090.56846: in run() - task 0affe814-3a2d-6c15-6a7e-000000000007 33192 1726883090.56858: variable 'ansible_search_path' from source: unknown 33192 1726883090.56891: calling self._execute() 33192 1726883090.56955: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.56962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.56975: variable 'omit' from source: magic vars 33192 1726883090.57061: variable 'omit' from source: magic vars 33192 1726883090.57087: variable 'omit' from source: magic vars 33192 1726883090.57123: variable 'omit' from source: magic vars 33192 1726883090.57159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33192 1726883090.57192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33192 1726883090.57210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33192 1726883090.57230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33192 1726883090.57242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33192 1726883090.57270: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 33192 1726883090.57276: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.57279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.57363: Set connection var ansible_shell_type to sh 33192 1726883090.57374: Set connection var ansible_connection to ssh 33192 1726883090.57383: Set connection var ansible_timeout to 10 33192 1726883090.57390: Set connection var ansible_module_compression to ZIP_DEFLATED 33192 1726883090.57396: Set connection var ansible_pipelining to False 33192 1726883090.57402: Set connection var ansible_shell_executable to /bin/sh 33192 1726883090.57427: variable 'ansible_shell_executable' from source: unknown 33192 1726883090.57431: variable 'ansible_connection' from source: unknown 33192 1726883090.57434: variable 'ansible_module_compression' from source: unknown 33192 1726883090.57436: variable 'ansible_shell_type' from source: unknown 33192 1726883090.57439: variable 'ansible_shell_executable' from source: unknown 33192 1726883090.57449: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.57452: variable 'ansible_pipelining' from source: unknown 33192 1726883090.57454: variable 'ansible_timeout' from source: unknown 33192 1726883090.57456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.57574: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 33192 1726883090.57582: variable 'omit' from source: magic vars 33192 1726883090.57588: starting attempt loop 33192 1726883090.57592: running the handler 33192 1726883090.57602: handler run complete 33192 1726883090.57612: attempt loop 
complete, returning result 33192 1726883090.57615: _execute() done 33192 1726883090.57617: dumping result to json 33192 1726883090.57623: done dumping result, returning 33192 1726883090.57628: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0affe814-3a2d-6c15-6a7e-000000000007] 33192 1726883090.57639: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000007 33192 1726883090.57726: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000007 33192 1726883090.57729: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 33192 1726883090.57802: no more pending results, returning what we have 33192 1726883090.57805: results queue empty 33192 1726883090.57806: checking for any_errors_fatal 33192 1726883090.57811: done checking for any_errors_fatal 33192 1726883090.57812: checking for max_fail_percentage 33192 1726883090.57813: done checking for max_fail_percentage 33192 1726883090.57814: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.57815: done checking to see if all hosts have failed 33192 1726883090.57815: getting the remaining hosts for this loop 33192 1726883090.57816: done getting the remaining hosts for this loop 33192 1726883090.57819: getting the next task for host managed_node1 33192 1726883090.57823: done getting next task for host managed_node1 33192 1726883090.57824: ^ task is: TASK: meta (flush_handlers) 33192 1726883090.57826: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.57828: getting variables 33192 1726883090.57830: in VariableManager get_vars() 33192 1726883090.57851: Calling all_inventory to load vars for managed_node1 33192 1726883090.57853: Calling groups_inventory to load vars for managed_node1 33192 1726883090.57857: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.57865: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.57867: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.57869: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.58019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.58216: done with get_vars() 33192 1726883090.58224: done getting variables 33192 1726883090.58274: in VariableManager get_vars() 33192 1726883090.58281: Calling all_inventory to load vars for managed_node1 33192 1726883090.58283: Calling groups_inventory to load vars for managed_node1 33192 1726883090.58285: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.58288: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.58291: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.58294: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.58417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.58593: done with get_vars() 33192 1726883090.58603: done queuing things up, now waiting for results queue to drain 33192 1726883090.58605: results queue empty 33192 1726883090.58605: checking for any_errors_fatal 33192 1726883090.58607: done checking for any_errors_fatal 33192 1726883090.58607: checking for max_fail_percentage 33192 1726883090.58608: done checking for max_fail_percentage 33192 1726883090.58609: checking to see if all hosts have failed and the running result is not 
ok 33192 1726883090.58609: done checking to see if all hosts have failed 33192 1726883090.58610: getting the remaining hosts for this loop 33192 1726883090.58610: done getting the remaining hosts for this loop 33192 1726883090.58612: getting the next task for host managed_node1 33192 1726883090.58616: done getting next task for host managed_node1 33192 1726883090.58617: ^ task is: TASK: meta (flush_handlers) 33192 1726883090.58619: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33192 1726883090.58624: getting variables 33192 1726883090.58625: in VariableManager get_vars() 33192 1726883090.58631: Calling all_inventory to load vars for managed_node1 33192 1726883090.58633: Calling groups_inventory to load vars for managed_node1 33192 1726883090.58637: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.58642: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.58643: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.58646: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.58769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.58965: done with get_vars() 33192 1726883090.58974: done getting variables 33192 1726883090.59007: in VariableManager get_vars() 33192 1726883090.59013: Calling all_inventory to load vars for managed_node1 33192 1726883090.59015: Calling groups_inventory to load vars for managed_node1 33192 1726883090.59016: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.59020: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.59021: Calling groups_plugins_inventory to load vars for 
managed_node1 33192 1726883090.59024: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.59145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.59318: done with get_vars() 33192 1726883090.59327: done queuing things up, now waiting for results queue to drain 33192 1726883090.59329: results queue empty 33192 1726883090.59329: checking for any_errors_fatal 33192 1726883090.59330: done checking for any_errors_fatal 33192 1726883090.59331: checking for max_fail_percentage 33192 1726883090.59331: done checking for max_fail_percentage 33192 1726883090.59332: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.59333: done checking to see if all hosts have failed 33192 1726883090.59333: getting the remaining hosts for this loop 33192 1726883090.59336: done getting the remaining hosts for this loop 33192 1726883090.59337: getting the next task for host managed_node1 33192 1726883090.59339: done getting next task for host managed_node1 33192 1726883090.59340: ^ task is: None 33192 1726883090.59341: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.59342: done queuing things up, now waiting for results queue to drain 33192 1726883090.59343: results queue empty 33192 1726883090.59343: checking for any_errors_fatal 33192 1726883090.59344: done checking for any_errors_fatal 33192 1726883090.59344: checking for max_fail_percentage 33192 1726883090.59345: done checking for max_fail_percentage 33192 1726883090.59346: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.59346: done checking to see if all hosts have failed 33192 1726883090.59347: getting the next task for host managed_node1 33192 1726883090.59349: done getting next task for host managed_node1 33192 1726883090.59349: ^ task is: None 33192 1726883090.59350: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.59386: in VariableManager get_vars() 33192 1726883090.59412: done with get_vars() 33192 1726883090.59417: in VariableManager get_vars() 33192 1726883090.59431: done with get_vars() 33192 1726883090.59436: variable 'omit' from source: magic vars 33192 1726883090.59459: in VariableManager get_vars() 33192 1726883090.59477: done with get_vars() 33192 1726883090.59494: variable 'omit' from source: magic vars PLAY [Play for testing wireless connection] ************************************ 33192 1726883090.60103: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 33192 1726883090.60124: getting the remaining hosts for this loop 33192 1726883090.60125: done getting the remaining hosts for this loop 33192 1726883090.60127: getting the next task for host managed_node1 33192 1726883090.60129: done getting next task for host managed_node1 33192 1726883090.60131: ^ task is: TASK: Gathering Facts 33192 1726883090.60132: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.60133: getting variables 33192 1726883090.60136: in VariableManager get_vars() 33192 1726883090.60151: Calling all_inventory to load vars for managed_node1 33192 1726883090.60152: Calling groups_inventory to load vars for managed_node1 33192 1726883090.60154: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.60159: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.60171: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.60174: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.60314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.60495: done with get_vars() 33192 1726883090.60502: done getting variables 33192 1726883090.60531: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:3 Friday 20 September 2024 21:44:50 -0400 (0:00:00.040) 0:00:04.005 ****** 33192 1726883090.60550: entering _queue_task() for managed_node1/gather_facts 33192 1726883090.60701: worker is 1 (out of 1 available) 33192 1726883090.60713: exiting _queue_task() for managed_node1/gather_facts 33192 1726883090.60723: done queuing things up, now waiting for results queue to drain 33192 1726883090.60725: waiting for pending results... 
33192 1726883090.60873: running TaskExecutor() for managed_node1/TASK: Gathering Facts 33192 1726883090.60939: in run() - task 0affe814-3a2d-6c15-6a7e-0000000001a3 33192 1726883090.60954: variable 'ansible_search_path' from source: unknown 33192 1726883090.60986: calling self._execute() 33192 1726883090.61048: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.61056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.61068: variable 'omit' from source: magic vars 33192 1726883090.61362: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.61372: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.61472: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.61480: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.61483: when evaluation is False, skipping this task 33192 1726883090.61488: _execute() done 33192 1726883090.61491: dumping result to json 33192 1726883090.61496: done dumping result, returning 33192 1726883090.61507: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affe814-3a2d-6c15-6a7e-0000000001a3] 33192 1726883090.61510: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000001a3 33192 1726883090.61593: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000001a3 33192 1726883090.61596: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883090.61655: no more pending results, returning what we have 33192 1726883090.61658: results queue empty 33192 1726883090.61660: checking for any_errors_fatal 33192 1726883090.61661: done checking for any_errors_fatal 33192 1726883090.61662: checking for max_fail_percentage 33192 1726883090.61663: done checking for max_fail_percentage 
33192 1726883090.61664: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.61665: done checking to see if all hosts have failed 33192 1726883090.61666: getting the remaining hosts for this loop 33192 1726883090.61668: done getting the remaining hosts for this loop 33192 1726883090.61671: getting the next task for host managed_node1 33192 1726883090.61676: done getting next task for host managed_node1 33192 1726883090.61678: ^ task is: TASK: meta (flush_handlers) 33192 1726883090.61681: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33192 1726883090.61685: getting variables 33192 1726883090.61686: in VariableManager get_vars() 33192 1726883090.61723: Calling all_inventory to load vars for managed_node1 33192 1726883090.61725: Calling groups_inventory to load vars for managed_node1 33192 1726883090.61727: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.61736: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.61739: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.61741: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.61884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.62064: done with get_vars() 33192 1726883090.62072: done getting variables 33192 1726883090.62120: in VariableManager get_vars() 33192 1726883090.62133: Calling all_inventory to load vars for managed_node1 33192 1726883090.62137: Calling groups_inventory to load vars for managed_node1 33192 1726883090.62140: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.62143: Calling all_plugins_play to load vars 
for managed_node1 33192 1726883090.62145: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.62149: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.62289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.62461: done with get_vars() 33192 1726883090.62474: done queuing things up, now waiting for results queue to drain 33192 1726883090.62476: results queue empty 33192 1726883090.62477: checking for any_errors_fatal 33192 1726883090.62479: done checking for any_errors_fatal 33192 1726883090.62481: checking for max_fail_percentage 33192 1726883090.62481: done checking for max_fail_percentage 33192 1726883090.62482: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.62483: done checking to see if all hosts have failed 33192 1726883090.62483: getting the remaining hosts for this loop 33192 1726883090.62484: done getting the remaining hosts for this loop 33192 1726883090.62486: getting the next task for host managed_node1 33192 1726883090.62488: done getting next task for host managed_node1 33192 1726883090.62490: ^ task is: TASK: INIT: wireless tests 33192 1726883090.62491: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.62492: getting variables 33192 1726883090.62493: in VariableManager get_vars() 33192 1726883090.62505: Calling all_inventory to load vars for managed_node1 33192 1726883090.62506: Calling groups_inventory to load vars for managed_node1 33192 1726883090.62508: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.62512: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.62513: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.62515: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.62639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.62812: done with get_vars() 33192 1726883090.62819: done getting variables 33192 1726883090.62878: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT: wireless tests] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:8 Friday 20 September 2024 21:44:50 -0400 (0:00:00.023) 0:00:04.029 ****** 33192 1726883090.62897: entering _queue_task() for managed_node1/debug 33192 1726883090.62898: Creating lock for debug 33192 1726883090.63079: worker is 1 (out of 1 available) 33192 1726883090.63091: exiting _queue_task() for managed_node1/debug 33192 1726883090.63103: done queuing things up, now waiting for results queue to drain 33192 1726883090.63104: waiting for pending results... 
33192 1726883090.63261: running TaskExecutor() for managed_node1/TASK: INIT: wireless tests 33192 1726883090.63326: in run() - task 0affe814-3a2d-6c15-6a7e-00000000000b 33192 1726883090.63339: variable 'ansible_search_path' from source: unknown 33192 1726883090.63374: calling self._execute() 33192 1726883090.63437: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.63444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.63456: variable 'omit' from source: magic vars 33192 1726883090.63732: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.63744: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.63844: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.63850: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.63853: when evaluation is False, skipping this task 33192 1726883090.63857: _execute() done 33192 1726883090.63861: dumping result to json 33192 1726883090.63866: done dumping result, returning 33192 1726883090.63877: done running TaskExecutor() for managed_node1/TASK: INIT: wireless tests [0affe814-3a2d-6c15-6a7e-00000000000b] 33192 1726883090.63880: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000000b 33192 1726883090.63970: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000000b 33192 1726883090.63976: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883090.64031: no more pending results, returning what we have 33192 1726883090.64036: results queue empty 33192 1726883090.64037: checking for any_errors_fatal 33192 1726883090.64039: done checking for any_errors_fatal 33192 1726883090.64040: checking for max_fail_percentage 33192 1726883090.64041: done checking for max_fail_percentage 33192 1726883090.64042: checking to see if all hosts 
have failed and the running result is not ok 33192 1726883090.64043: done checking to see if all hosts have failed 33192 1726883090.64044: getting the remaining hosts for this loop 33192 1726883090.64046: done getting the remaining hosts for this loop 33192 1726883090.64050: getting the next task for host managed_node1 33192 1726883090.64055: done getting next task for host managed_node1 33192 1726883090.64057: ^ task is: TASK: Include the task 'setup_mock_wifi.yml' 33192 1726883090.64060: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33192 1726883090.64064: getting variables 33192 1726883090.64065: in VariableManager get_vars() 33192 1726883090.64102: Calling all_inventory to load vars for managed_node1 33192 1726883090.64105: Calling groups_inventory to load vars for managed_node1 33192 1726883090.64107: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.64114: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.64116: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.64118: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.64263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.64476: done with get_vars() 33192 1726883090.64484: done getting variables TASK [Include the task 'setup_mock_wifi.yml'] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:11 Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.046 ****** 33192 1726883090.64552: entering _queue_task() for managed_node1/include_tasks 33192 1726883090.64717: worker is 1 
(out of 1 available) 33192 1726883090.64728: exiting _queue_task() for managed_node1/include_tasks 33192 1726883090.64741: done queuing things up, now waiting for results queue to drain 33192 1726883090.64742: waiting for pending results... 33192 1726883090.64878: running TaskExecutor() for managed_node1/TASK: Include the task 'setup_mock_wifi.yml' 33192 1726883090.64936: in run() - task 0affe814-3a2d-6c15-6a7e-00000000000c 33192 1726883090.64947: variable 'ansible_search_path' from source: unknown 33192 1726883090.65001: calling self._execute() 33192 1726883090.65083: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.65087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.65090: variable 'omit' from source: magic vars 33192 1726883090.65327: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.65335: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.65430: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.65434: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.65445: when evaluation is False, skipping this task 33192 1726883090.65448: _execute() done 33192 1726883090.65452: dumping result to json 33192 1726883090.65454: done dumping result, returning 33192 1726883090.65457: done running TaskExecutor() for managed_node1/TASK: Include the task 'setup_mock_wifi.yml' [0affe814-3a2d-6c15-6a7e-00000000000c] 33192 1726883090.65464: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000000c 33192 1726883090.65557: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000000c 33192 1726883090.65560: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883090.65609: no more pending results, returning 
what we have 33192 1726883090.65613: results queue empty 33192 1726883090.65614: checking for any_errors_fatal 33192 1726883090.65621: done checking for any_errors_fatal 33192 1726883090.65622: checking for max_fail_percentage 33192 1726883090.65624: done checking for max_fail_percentage 33192 1726883090.65625: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.65626: done checking to see if all hosts have failed 33192 1726883090.65627: getting the remaining hosts for this loop 33192 1726883090.65628: done getting the remaining hosts for this loop 33192 1726883090.65632: getting the next task for host managed_node1 33192 1726883090.65638: done getting next task for host managed_node1 33192 1726883090.65641: ^ task is: TASK: Copy client certs 33192 1726883090.65643: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.65647: getting variables 33192 1726883090.65648: in VariableManager get_vars() 33192 1726883090.65688: Calling all_inventory to load vars for managed_node1 33192 1726883090.65690: Calling groups_inventory to load vars for managed_node1 33192 1726883090.65692: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.65699: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.65701: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.65703: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.65850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.66030: done with get_vars() 33192 1726883090.66040: done getting variables 33192 1726883090.66084: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Copy client certs] ******************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:13 Friday 20 September 2024 21:44:50 -0400 (0:00:00.015) 0:00:04.061 ****** 33192 1726883090.66106: entering _queue_task() for managed_node1/copy 33192 1726883090.66274: worker is 1 (out of 1 available) 33192 1726883090.66288: exiting _queue_task() for managed_node1/copy 33192 1726883090.66299: done queuing things up, now waiting for results queue to drain 33192 1726883090.66300: waiting for pending results... 
33192 1726883090.66429: running TaskExecutor() for managed_node1/TASK: Copy client certs 33192 1726883090.66490: in run() - task 0affe814-3a2d-6c15-6a7e-00000000000d 33192 1726883090.66501: variable 'ansible_search_path' from source: unknown 33192 1726883090.66669: Loaded config def from plugin (lookup/items) 33192 1726883090.66676: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 33192 1726883090.66705: variable 'omit' from source: magic vars 33192 1726883090.66794: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.66804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.66814: variable 'omit' from source: magic vars 33192 1726883090.67106: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.67115: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.67212: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.67217: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.67221: when evaluation is False, skipping this task 33192 1726883090.67244: variable 'item' from source: unknown 33192 1726883090.67302: variable 'item' from source: unknown skipping: [managed_node1] => (item=client.key) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "client.key", "skip_reason": "Conditional result was False" } 33192 1726883090.67444: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.67447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.67450: variable 'omit' from source: magic vars 33192 1726883090.67551: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.67555: Evaluated conditional (ansible_distribution_major_version != '6'): True 
33192 1726883090.67650: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.67654: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.67657: when evaluation is False, skipping this task 33192 1726883090.67683: variable 'item' from source: unknown 33192 1726883090.67732: variable 'item' from source: unknown skipping: [managed_node1] => (item=client.pem) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "client.pem", "skip_reason": "Conditional result was False" } 33192 1726883090.67824: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.67828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.67831: variable 'omit' from source: magic vars 33192 1726883090.67957: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.67962: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.68056: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.68059: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.68062: when evaluation is False, skipping this task 33192 1726883090.68087: variable 'item' from source: unknown 33192 1726883090.68139: variable 'item' from source: unknown skipping: [managed_node1] => (item=cacert.pem) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "cacert.pem", "skip_reason": "Conditional result was False" } 33192 1726883090.68219: dumping result to json 33192 1726883090.68222: done dumping result, returning 33192 1726883090.68225: done running TaskExecutor() for managed_node1/TASK: Copy client certs [0affe814-3a2d-6c15-6a7e-00000000000d] 33192 1726883090.68227: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000000d 33192 
1726883090.68269: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000000d 33192 1726883090.68274: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false } MSG: All items skipped 33192 1726883090.68326: no more pending results, returning what we have 33192 1726883090.68329: results queue empty 33192 1726883090.68330: checking for any_errors_fatal 33192 1726883090.68342: done checking for any_errors_fatal 33192 1726883090.68343: checking for max_fail_percentage 33192 1726883090.68345: done checking for max_fail_percentage 33192 1726883090.68346: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.68347: done checking to see if all hosts have failed 33192 1726883090.68348: getting the remaining hosts for this loop 33192 1726883090.68350: done getting the remaining hosts for this loop 33192 1726883090.68353: getting the next task for host managed_node1 33192 1726883090.68359: done getting next task for host managed_node1 33192 1726883090.68361: ^ task is: TASK: TEST: wireless connection with WPA-PSK 33192 1726883090.68363: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.68366: getting variables 33192 1726883090.68368: in VariableManager get_vars() 33192 1726883090.68411: Calling all_inventory to load vars for managed_node1 33192 1726883090.68413: Calling groups_inventory to load vars for managed_node1 33192 1726883090.68415: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.68422: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.68424: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.68426: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.68611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.68792: done with get_vars() 33192 1726883090.68799: done getting variables 33192 1726883090.68846: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: wireless connection with WPA-PSK] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:24 Friday 20 September 2024 21:44:50 -0400 (0:00:00.027) 0:00:04.089 ****** 33192 1726883090.68866: entering _queue_task() for managed_node1/debug 33192 1726883090.69047: worker is 1 (out of 1 available) 33192 1726883090.69060: exiting _queue_task() for managed_node1/debug 33192 1726883090.69076: done queuing things up, now waiting for results queue to drain 33192 1726883090.69077: waiting for pending results... 
33192 1726883090.69219: running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with WPA-PSK 33192 1726883090.69290: in run() - task 0affe814-3a2d-6c15-6a7e-00000000000f 33192 1726883090.69307: variable 'ansible_search_path' from source: unknown 33192 1726883090.69345: calling self._execute() 33192 1726883090.69401: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.69414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.69419: variable 'omit' from source: magic vars 33192 1726883090.69700: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.69709: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.69808: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.69812: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.69818: when evaluation is False, skipping this task 33192 1726883090.69821: _execute() done 33192 1726883090.69825: dumping result to json 33192 1726883090.69830: done dumping result, returning 33192 1726883090.69838: done running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with WPA-PSK [0affe814-3a2d-6c15-6a7e-00000000000f] 33192 1726883090.69843: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000000f 33192 1726883090.69933: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000000f 33192 1726883090.69939: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883090.69998: no more pending results, returning what we have 33192 1726883090.70001: results queue empty 33192 1726883090.70003: checking for any_errors_fatal 33192 1726883090.70009: done checking for any_errors_fatal 33192 1726883090.70010: checking for max_fail_percentage 33192 1726883090.70011: done checking for max_fail_percentage 33192 
1726883090.70012: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.70013: done checking to see if all hosts have failed 33192 1726883090.70014: getting the remaining hosts for this loop 33192 1726883090.70016: done getting the remaining hosts for this loop 33192 1726883090.70019: getting the next task for host managed_node1 33192 1726883090.70025: done getting next task for host managed_node1 33192 1726883090.70029: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33192 1726883090.70032: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.70050: getting variables 33192 1726883090.70051: in VariableManager get_vars() 33192 1726883090.70087: Calling all_inventory to load vars for managed_node1 33192 1726883090.70089: Calling groups_inventory to load vars for managed_node1 33192 1726883090.70091: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.70098: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.70100: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.70102: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.70247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.70592: done with get_vars() 33192 1726883090.70601: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:50 -0400 (0:00:00.018) 0:00:04.107 ****** 33192 1726883090.70669: entering _queue_task() for managed_node1/include_tasks 33192 1726883090.70846: worker is 1 (out of 1 available) 33192 1726883090.70859: exiting _queue_task() for managed_node1/include_tasks 33192 1726883090.70873: done queuing things up, now waiting for results queue to drain 33192 1726883090.70875: waiting for pending results... 
33192 1726883090.71019: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33192 1726883090.71113: in run() - task 0affe814-3a2d-6c15-6a7e-000000000017 33192 1726883090.71121: variable 'ansible_search_path' from source: unknown 33192 1726883090.71124: variable 'ansible_search_path' from source: unknown 33192 1726883090.71155: calling self._execute() 33192 1726883090.71214: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.71228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.71325: variable 'omit' from source: magic vars 33192 1726883090.71518: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.71527: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.71623: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.71627: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.71632: when evaluation is False, skipping this task 33192 1726883090.71637: _execute() done 33192 1726883090.71640: dumping result to json 33192 1726883090.71647: done dumping result, returning 33192 1726883090.71657: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-6c15-6a7e-000000000017] 33192 1726883090.71661: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000017 33192 1726883090.71757: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000017 33192 1726883090.71760: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883090.71809: no more pending results, returning what we have 33192 1726883090.71813: results queue empty 33192 1726883090.71814: checking for 
any_errors_fatal 33192 1726883090.71821: done checking for any_errors_fatal 33192 1726883090.71822: checking for max_fail_percentage 33192 1726883090.71824: done checking for max_fail_percentage 33192 1726883090.71824: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.71826: done checking to see if all hosts have failed 33192 1726883090.71827: getting the remaining hosts for this loop 33192 1726883090.71828: done getting the remaining hosts for this loop 33192 1726883090.71832: getting the next task for host managed_node1 33192 1726883090.71848: done getting next task for host managed_node1 33192 1726883090.71852: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 33192 1726883090.71855: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.71865: getting variables 33192 1726883090.71866: in VariableManager get_vars() 33192 1726883090.71899: Calling all_inventory to load vars for managed_node1 33192 1726883090.71901: Calling groups_inventory to load vars for managed_node1 33192 1726883090.71903: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.71909: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.71911: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.71914: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.72064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.72250: done with get_vars() 33192 1726883090.72260: done getting variables 33192 1726883090.72305: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.123 ****** 33192 1726883090.72327: entering _queue_task() for managed_node1/debug 33192 1726883090.72493: worker is 1 (out of 1 available) 33192 1726883090.72507: exiting _queue_task() for managed_node1/debug 33192 1726883090.72518: done queuing things up, now waiting for results queue to drain 33192 1726883090.72520: waiting for pending results... 
33192 1726883090.72681: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 33192 1726883090.72781: in run() - task 0affe814-3a2d-6c15-6a7e-000000000018 33192 1726883090.72794: variable 'ansible_search_path' from source: unknown 33192 1726883090.72798: variable 'ansible_search_path' from source: unknown 33192 1726883090.72828: calling self._execute() 33192 1726883090.72902: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.72908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.72918: variable 'omit' from source: magic vars 33192 1726883090.73222: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.73232: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.73328: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.73336: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.73339: when evaluation is False, skipping this task 33192 1726883090.73343: _execute() done 33192 1726883090.73348: dumping result to json 33192 1726883090.73353: done dumping result, returning 33192 1726883090.73360: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-6c15-6a7e-000000000018] 33192 1726883090.73366: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000018 33192 1726883090.73456: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000018 33192 1726883090.73460: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883090.73507: no more pending results, returning what we have 33192 1726883090.73510: results queue empty 33192 1726883090.73511: checking for any_errors_fatal 33192 1726883090.73516: done checking for any_errors_fatal 33192 1726883090.73517: 
checking for max_fail_percentage 33192 1726883090.73519: done checking for max_fail_percentage 33192 1726883090.73520: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.73521: done checking to see if all hosts have failed 33192 1726883090.73522: getting the remaining hosts for this loop 33192 1726883090.73523: done getting the remaining hosts for this loop 33192 1726883090.73527: getting the next task for host managed_node1 33192 1726883090.73532: done getting next task for host managed_node1 33192 1726883090.73538: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33192 1726883090.73541: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.73556: getting variables 33192 1726883090.73558: in VariableManager get_vars() 33192 1726883090.73597: Calling all_inventory to load vars for managed_node1 33192 1726883090.73599: Calling groups_inventory to load vars for managed_node1 33192 1726883090.73601: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.73608: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.73610: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.73612: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.73786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.73973: done with get_vars() 33192 1726883090.73981: done getting variables 33192 1726883090.74051: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:50 -0400 (0:00:00.017) 0:00:04.141 ****** 33192 1726883090.74074: entering _queue_task() for managed_node1/fail 33192 1726883090.74076: Creating lock for fail 33192 1726883090.74243: worker is 1 (out of 1 available) 33192 1726883090.74257: exiting _queue_task() for managed_node1/fail 33192 1726883090.74268: done queuing things up, now waiting for results queue to drain 33192 1726883090.74269: waiting for pending results... 
33192 1726883090.74416: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33192 1726883090.74513: in run() - task 0affe814-3a2d-6c15-6a7e-000000000019 33192 1726883090.74525: variable 'ansible_search_path' from source: unknown 33192 1726883090.74528: variable 'ansible_search_path' from source: unknown 33192 1726883090.74559: calling self._execute() 33192 1726883090.74620: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.74627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.74637: variable 'omit' from source: magic vars 33192 1726883090.74915: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.74924: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.75022: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.75027: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.75031: when evaluation is False, skipping this task 33192 1726883090.75036: _execute() done 33192 1726883090.75044: dumping result to json 33192 1726883090.75049: done dumping result, returning 33192 1726883090.75058: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-6c15-6a7e-000000000019] 33192 1726883090.75061: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000019 33192 1726883090.75154: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000019 33192 1726883090.75159: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 
33192 1726883090.75206: no more pending results, returning what we have 33192 1726883090.75209: results queue empty 33192 1726883090.75210: checking for any_errors_fatal 33192 1726883090.75213: done checking for any_errors_fatal 33192 1726883090.75214: checking for max_fail_percentage 33192 1726883090.75216: done checking for max_fail_percentage 33192 1726883090.75217: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.75218: done checking to see if all hosts have failed 33192 1726883090.75219: getting the remaining hosts for this loop 33192 1726883090.75221: done getting the remaining hosts for this loop 33192 1726883090.75224: getting the next task for host managed_node1 33192 1726883090.75229: done getting next task for host managed_node1 33192 1726883090.75233: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33192 1726883090.75238: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.75251: getting variables 33192 1726883090.75253: in VariableManager get_vars() 33192 1726883090.75291: Calling all_inventory to load vars for managed_node1 33192 1726883090.75293: Calling groups_inventory to load vars for managed_node1 33192 1726883090.75295: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.75301: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.75303: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.75306: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.75453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.75644: done with get_vars() 33192 1726883090.75651: done getting variables 33192 1726883090.75696: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.157 ****** 33192 1726883090.75721: entering _queue_task() for managed_node1/fail 33192 1726883090.75885: worker is 1 (out of 1 available) 33192 1726883090.75897: exiting _queue_task() for managed_node1/fail 33192 1726883090.75909: done queuing things up, now waiting for results queue to drain 33192 1726883090.75911: waiting for pending results... 
33192 1726883090.76058: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33192 1726883090.76143: in run() - task 0affe814-3a2d-6c15-6a7e-00000000001a 33192 1726883090.76154: variable 'ansible_search_path' from source: unknown 33192 1726883090.76158: variable 'ansible_search_path' from source: unknown 33192 1726883090.76189: calling self._execute() 33192 1726883090.76247: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.76254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.76264: variable 'omit' from source: magic vars 33192 1726883090.76550: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.76560: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.76715: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.76721: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.76725: when evaluation is False, skipping this task 33192 1726883090.76730: _execute() done 33192 1726883090.76733: dumping result to json 33192 1726883090.76739: done dumping result, returning 33192 1726883090.76747: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-6c15-6a7e-00000000001a] 33192 1726883090.76752: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000001a 33192 1726883090.76845: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000001a 33192 1726883090.76848: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883090.76894: no more 
pending results, returning what we have 33192 1726883090.76897: results queue empty 33192 1726883090.76898: checking for any_errors_fatal 33192 1726883090.76904: done checking for any_errors_fatal 33192 1726883090.76905: checking for max_fail_percentage 33192 1726883090.76907: done checking for max_fail_percentage 33192 1726883090.76908: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.76909: done checking to see if all hosts have failed 33192 1726883090.76910: getting the remaining hosts for this loop 33192 1726883090.76911: done getting the remaining hosts for this loop 33192 1726883090.76915: getting the next task for host managed_node1 33192 1726883090.76920: done getting next task for host managed_node1 33192 1726883090.76924: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33192 1726883090.76928: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.76943: getting variables 33192 1726883090.76945: in VariableManager get_vars() 33192 1726883090.76980: Calling all_inventory to load vars for managed_node1 33192 1726883090.76982: Calling groups_inventory to load vars for managed_node1 33192 1726883090.76984: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.76990: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.76992: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.76994: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.77175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.77357: done with get_vars() 33192 1726883090.77365: done getting variables 33192 1726883090.77412: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:44:50 -0400 (0:00:00.017) 0:00:04.174 ****** 33192 1726883090.77435: entering _queue_task() for managed_node1/fail 33192 1726883090.77594: worker is 1 (out of 1 available) 33192 1726883090.77606: exiting _queue_task() for managed_node1/fail 33192 1726883090.77616: done queuing things up, now waiting for results queue to drain 33192 1726883090.77617: waiting for pending results... 
33192 1726883090.77762: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
33192 1726883090.77842: in run() - task 0affe814-3a2d-6c15-6a7e-00000000001b
33192 1726883090.77857: variable 'ansible_search_path' from source: unknown
33192 1726883090.77861: variable 'ansible_search_path' from source: unknown
33192 1726883090.77886: calling self._execute()
33192 1726883090.77941: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883090.77951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883090.77964: variable 'omit' from source: magic vars
33192 1726883090.78226: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.78237: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883090.78337: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.78343: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883090.78346: when evaluation is False, skipping this task
33192 1726883090.78349: _execute() done
33192 1726883090.78354: dumping result to json
33192 1726883090.78359: done dumping result, returning
33192 1726883090.78367: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-6c15-6a7e-00000000001b]
33192 1726883090.78374: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000001b
33192 1726883090.78468: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000001b
33192 1726883090.78474: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883090.78530: no more pending results, returning what we have
33192 1726883090.78541: results queue empty
33192 1726883090.78543: checking for any_errors_fatal
33192 1726883090.78548: done checking for any_errors_fatal
33192 1726883090.78549: checking for max_fail_percentage
33192 1726883090.78551: done checking for max_fail_percentage
33192 1726883090.78552: checking to see if all hosts have failed and the running result is not ok
33192 1726883090.78553: done checking to see if all hosts have failed
33192 1726883090.78554: getting the remaining hosts for this loop
33192 1726883090.78555: done getting the remaining hosts for this loop
33192 1726883090.78558: getting the next task for host managed_node1
33192 1726883090.78562: done getting next task for host managed_node1
33192 1726883090.78564: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
33192 1726883090.78566: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883090.78578: getting variables
33192 1726883090.78579: in VariableManager get_vars()
33192 1726883090.78613: Calling all_inventory to load vars for managed_node1
33192 1726883090.78615: Calling groups_inventory to load vars for managed_node1
33192 1726883090.78617: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883090.78623: Calling all_plugins_play to load vars for managed_node1
33192 1726883090.78625: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883090.78627: Calling groups_plugins_play to load vars for managed_node1
33192 1726883090.78784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883090.78993: done with get_vars()
33192 1726883090.79000: done getting variables
33192 1726883090.79074: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.191 ******
33192 1726883090.79097: entering _queue_task() for managed_node1/dnf
33192 1726883090.79259: worker is 1 (out of 1 available)
33192 1726883090.79274: exiting _queue_task() for managed_node1/dnf
33192 1726883090.79287: done queuing things up, now waiting for results queue to drain
33192 1726883090.79288: waiting for pending results...
33192 1726883090.79423: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
33192 1726883090.79505: in run() - task 0affe814-3a2d-6c15-6a7e-00000000001c
33192 1726883090.79518: variable 'ansible_search_path' from source: unknown
33192 1726883090.79522: variable 'ansible_search_path' from source: unknown
33192 1726883090.79553: calling self._execute()
33192 1726883090.79609: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883090.79617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883090.79628: variable 'omit' from source: magic vars
33192 1726883090.79899: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.79909: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883090.80007: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.80011: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883090.80015: when evaluation is False, skipping this task
33192 1726883090.80019: _execute() done
33192 1726883090.80023: dumping result to json
33192 1726883090.80028: done dumping result, returning
33192 1726883090.80038: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-00000000001c]
33192 1726883090.80043: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000001c
33192 1726883090.80138: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000001c
33192 1726883090.80142: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883090.80207: no more pending results, returning what we have
33192 1726883090.80210: results queue empty
33192 1726883090.80211: checking for any_errors_fatal
33192 1726883090.80217: done checking for any_errors_fatal
33192 1726883090.80218: checking for max_fail_percentage
33192 1726883090.80220: done checking for max_fail_percentage
33192 1726883090.80221: checking to see if all hosts have failed and the running result is not ok
33192 1726883090.80222: done checking to see if all hosts have failed
33192 1726883090.80222: getting the remaining hosts for this loop
33192 1726883090.80224: done getting the remaining hosts for this loop
33192 1726883090.80227: getting the next task for host managed_node1
33192 1726883090.80230: done getting next task for host managed_node1
33192 1726883090.80235: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
33192 1726883090.80238: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883090.80247: getting variables
33192 1726883090.80248: in VariableManager get_vars()
33192 1726883090.80284: Calling all_inventory to load vars for managed_node1
33192 1726883090.80286: Calling groups_inventory to load vars for managed_node1
33192 1726883090.80288: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883090.80294: Calling all_plugins_play to load vars for managed_node1
33192 1726883090.80296: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883090.80298: Calling groups_plugins_play to load vars for managed_node1
33192 1726883090.80449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883090.80643: done with get_vars()
33192 1726883090.80650: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
33192 1726883090.80709: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.207 ******
33192 1726883090.80731: entering _queue_task() for managed_node1/yum
33192 1726883090.80732: Creating lock for yum
33192 1726883090.80909: worker is 1 (out of 1 available)
33192 1726883090.80922: exiting _queue_task() for managed_node1/yum
33192 1726883090.80932: done queuing things up, now waiting for results queue to drain
33192 1726883090.80935: waiting for pending results...
33192 1726883090.81081: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
33192 1726883090.81156: in run() - task 0affe814-3a2d-6c15-6a7e-00000000001d
33192 1726883090.81170: variable 'ansible_search_path' from source: unknown
33192 1726883090.81176: variable 'ansible_search_path' from source: unknown
33192 1726883090.81202: calling self._execute()
33192 1726883090.81262: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883090.81276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883090.81280: variable 'omit' from source: magic vars
33192 1726883090.81552: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.81562: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883090.81659: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.81665: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883090.81668: when evaluation is False, skipping this task
33192 1726883090.81675: _execute() done
33192 1726883090.81678: dumping result to json
33192 1726883090.81682: done dumping result, returning
33192 1726883090.81689: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-00000000001d]
33192 1726883090.81694: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000001d
33192 1726883090.81790: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000001d
33192 1726883090.81794: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883090.81860: no more pending results, returning what we have
33192 1726883090.81863: results queue empty
33192 1726883090.81864: checking for any_errors_fatal
33192 1726883090.81870: done checking for any_errors_fatal
33192 1726883090.81873: checking for max_fail_percentage
33192 1726883090.81875: done checking for max_fail_percentage
33192 1726883090.81876: checking to see if all hosts have failed and the running result is not ok
33192 1726883090.81876: done checking to see if all hosts have failed
33192 1726883090.81877: getting the remaining hosts for this loop
33192 1726883090.81878: done getting the remaining hosts for this loop
33192 1726883090.81881: getting the next task for host managed_node1
33192 1726883090.81885: done getting next task for host managed_node1
33192 1726883090.81887: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
33192 1726883090.81889: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883090.81899: getting variables
33192 1726883090.81900: in VariableManager get_vars()
33192 1726883090.81938: Calling all_inventory to load vars for managed_node1
33192 1726883090.81940: Calling groups_inventory to load vars for managed_node1
33192 1726883090.81942: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883090.81948: Calling all_plugins_play to load vars for managed_node1
33192 1726883090.81950: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883090.81952: Calling groups_plugins_play to load vars for managed_node1
33192 1726883090.82129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883090.82313: done with get_vars()
33192 1726883090.82320: done getting variables
33192 1726883090.82367: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.224 ******
33192 1726883090.82392: entering _queue_task() for managed_node1/fail
33192 1726883090.82560: worker is 1 (out of 1 available)
33192 1726883090.82575: exiting _queue_task() for managed_node1/fail
33192 1726883090.82588: done queuing things up, now waiting for results queue to drain
33192 1726883090.82589: waiting for pending results...
33192 1726883090.82726: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
33192 1726883090.82811: in run() - task 0affe814-3a2d-6c15-6a7e-00000000001e
33192 1726883090.82821: variable 'ansible_search_path' from source: unknown
33192 1726883090.82824: variable 'ansible_search_path' from source: unknown
33192 1726883090.82859: calling self._execute()
33192 1726883090.82918: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883090.82930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883090.82933: variable 'omit' from source: magic vars
33192 1726883090.83211: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.83221: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883090.83320: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.83324: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883090.83329: when evaluation is False, skipping this task
33192 1726883090.83332: _execute() done
33192 1726883090.83360: dumping result to json
33192 1726883090.83365: done dumping result, returning
33192 1726883090.83369: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-00000000001e]
33192 1726883090.83374: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000001e
33192 1726883090.83448: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000001e
33192 1726883090.83451: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883090.83502: no more pending results, returning what we have
33192 1726883090.83505: results queue empty
33192 1726883090.83507: checking for any_errors_fatal
33192 1726883090.83511: done checking for any_errors_fatal
33192 1726883090.83512: checking for max_fail_percentage
33192 1726883090.83514: done checking for max_fail_percentage
33192 1726883090.83515: checking to see if all hosts have failed and the running result is not ok
33192 1726883090.83516: done checking to see if all hosts have failed
33192 1726883090.83517: getting the remaining hosts for this loop
33192 1726883090.83518: done getting the remaining hosts for this loop
33192 1726883090.83522: getting the next task for host managed_node1
33192 1726883090.83528: done getting next task for host managed_node1
33192 1726883090.83531: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
33192 1726883090.83536: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883090.83550: getting variables
33192 1726883090.83552: in VariableManager get_vars()
33192 1726883090.83584: Calling all_inventory to load vars for managed_node1
33192 1726883090.83586: Calling groups_inventory to load vars for managed_node1
33192 1726883090.83588: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883090.83594: Calling all_plugins_play to load vars for managed_node1
33192 1726883090.83596: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883090.83599: Calling groups_plugins_play to load vars for managed_node1
33192 1726883090.83747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883090.83936: done with get_vars()
33192 1726883090.83944: done getting variables
33192 1726883090.83992: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.240 ******
33192 1726883090.84014: entering _queue_task() for managed_node1/package
33192 1726883090.84187: worker is 1 (out of 1 available)
33192 1726883090.84201: exiting _queue_task() for managed_node1/package
33192 1726883090.84212: done queuing things up, now waiting for results queue to drain
33192 1726883090.84213: waiting for pending results...
33192 1726883090.84350: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
33192 1726883090.84433: in run() - task 0affe814-3a2d-6c15-6a7e-00000000001f
33192 1726883090.84447: variable 'ansible_search_path' from source: unknown
33192 1726883090.84451: variable 'ansible_search_path' from source: unknown
33192 1726883090.84482: calling self._execute()
33192 1726883090.84538: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883090.84548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883090.84583: variable 'omit' from source: magic vars
33192 1726883090.84878: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.84888: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883090.84979: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.84984: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883090.84991: when evaluation is False, skipping this task
33192 1726883090.84995: _execute() done
33192 1726883090.84998: dumping result to json
33192 1726883090.85000: done dumping result, returning
33192 1726883090.85010: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-6c15-6a7e-00000000001f]
33192 1726883090.85014: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000001f
33192 1726883090.85112: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000001f
33192 1726883090.85115: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883090.85166: no more pending results, returning what we have
33192 1726883090.85170: results queue empty
33192 1726883090.85173: checking for any_errors_fatal
33192 1726883090.85178: done checking for any_errors_fatal
33192 1726883090.85179: checking for max_fail_percentage
33192 1726883090.85181: done checking for max_fail_percentage
33192 1726883090.85182: checking to see if all hosts have failed and the running result is not ok
33192 1726883090.85183: done checking to see if all hosts have failed
33192 1726883090.85184: getting the remaining hosts for this loop
33192 1726883090.85185: done getting the remaining hosts for this loop
33192 1726883090.85189: getting the next task for host managed_node1
33192 1726883090.85194: done getting next task for host managed_node1
33192 1726883090.85198: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
33192 1726883090.85201: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883090.85214: getting variables
33192 1726883090.85215: in VariableManager get_vars()
33192 1726883090.85249: Calling all_inventory to load vars for managed_node1
33192 1726883090.85252: Calling groups_inventory to load vars for managed_node1
33192 1726883090.85253: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883090.85260: Calling all_plugins_play to load vars for managed_node1
33192 1726883090.85262: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883090.85264: Calling groups_plugins_play to load vars for managed_node1
33192 1726883090.85442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883090.85626: done with get_vars()
33192 1726883090.85633: done getting variables
33192 1726883090.85684: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.257 ******
33192 1726883090.85707: entering _queue_task() for managed_node1/package
33192 1726883090.85878: worker is 1 (out of 1 available)
33192 1726883090.85892: exiting _queue_task() for managed_node1/package
33192 1726883090.85903: done queuing things up, now waiting for results queue to drain
33192 1726883090.85904: waiting for pending results...
33192 1726883090.86040: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
33192 1726883090.86126: in run() - task 0affe814-3a2d-6c15-6a7e-000000000020
33192 1726883090.86141: variable 'ansible_search_path' from source: unknown
33192 1726883090.86145: variable 'ansible_search_path' from source: unknown
33192 1726883090.86176: calling self._execute()
33192 1726883090.86227: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883090.86233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883090.86246: variable 'omit' from source: magic vars
33192 1726883090.86511: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.86521: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883090.86618: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.86624: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883090.86627: when evaluation is False, skipping this task
33192 1726883090.86631: _execute() done
33192 1726883090.86637: dumping result to json
33192 1726883090.86642: done dumping result, returning
33192 1726883090.86650: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-6c15-6a7e-000000000020]
33192 1726883090.86655: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000020
33192 1726883090.86762: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000020
33192 1726883090.86766: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883090.86821: no more pending results, returning what we have
33192 1726883090.86824: results queue empty
33192 1726883090.86825: checking for any_errors_fatal
33192 1726883090.86830: done checking for any_errors_fatal
33192 1726883090.86831: checking for max_fail_percentage
33192 1726883090.86833: done checking for max_fail_percentage
33192 1726883090.86836: checking to see if all hosts have failed and the running result is not ok
33192 1726883090.86837: done checking to see if all hosts have failed
33192 1726883090.86838: getting the remaining hosts for this loop
33192 1726883090.86839: done getting the remaining hosts for this loop
33192 1726883090.86842: getting the next task for host managed_node1
33192 1726883090.86851: done getting next task for host managed_node1
33192 1726883090.86854: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
33192 1726883090.86856: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883090.86866: getting variables
33192 1726883090.86867: in VariableManager get_vars()
33192 1726883090.86899: Calling all_inventory to load vars for managed_node1
33192 1726883090.86900: Calling groups_inventory to load vars for managed_node1
33192 1726883090.86902: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883090.86908: Calling all_plugins_play to load vars for managed_node1
33192 1726883090.86910: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883090.86912: Calling groups_plugins_play to load vars for managed_node1
33192 1726883090.87059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883090.87265: done with get_vars()
33192 1726883090.87276: done getting variables
33192 1726883090.87320: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.273 ******
33192 1726883090.87344: entering _queue_task() for managed_node1/package
33192 1726883090.87512: worker is 1 (out of 1 available)
33192 1726883090.87527: exiting _queue_task() for managed_node1/package
33192 1726883090.87540: done queuing things up, now waiting for results queue to drain
33192 1726883090.87542: waiting for pending results...
33192 1726883090.87679: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
33192 1726883090.87754: in run() - task 0affe814-3a2d-6c15-6a7e-000000000021
33192 1726883090.87770: variable 'ansible_search_path' from source: unknown
33192 1726883090.87777: variable 'ansible_search_path' from source: unknown
33192 1726883090.87798: calling self._execute()
33192 1726883090.87865: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883090.87873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883090.87887: variable 'omit' from source: magic vars
33192 1726883090.88184: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.88194: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883090.88294: variable 'ansible_distribution_major_version' from source: facts
33192 1726883090.88299: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883090.88303: when evaluation is False, skipping this task
33192 1726883090.88308: _execute() done
33192 1726883090.88312: dumping result to json
33192 1726883090.88315: done dumping result, returning
33192 1726883090.88325: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-6c15-6a7e-000000000021]
33192 1726883090.88330: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000021
33192 1726883090.88427: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000021
33192 1726883090.88431: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883090.88478: no more pending results, returning what we have
33192 1726883090.88482: results queue empty
33192 1726883090.88483: checking for any_errors_fatal
33192 1726883090.88492: done checking for any_errors_fatal
33192 1726883090.88493: checking for max_fail_percentage
33192 1726883090.88494: done checking for max_fail_percentage
33192 1726883090.88495: checking to see if all hosts have failed and the running result is not ok
33192 1726883090.88497: done checking to see if all hosts have failed
33192 1726883090.88498: getting the remaining hosts for this loop
33192 1726883090.88499: done getting the remaining hosts for this loop
33192 1726883090.88502: getting the next task for host managed_node1
33192 1726883090.88508: done getting next task for host managed_node1
33192 1726883090.88511: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
33192 1726883090.88514: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883090.88527: getting variables
33192 1726883090.88529: in VariableManager get_vars()
33192 1726883090.88568: Calling all_inventory to load vars for managed_node1
33192 1726883090.88570: Calling groups_inventory to load vars for managed_node1
33192 1726883090.88574: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883090.88580: Calling all_plugins_play to load vars for managed_node1
33192 1726883090.88582: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883090.88584: Calling groups_plugins_play to load vars for managed_node1
33192 1726883090.88730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883090.88919: done with get_vars()
33192 1726883090.88927: done getting variables
33192 1726883090.89004: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.290 ******
33192 1726883090.89026: entering _queue_task() for managed_node1/service
33192 1726883090.89028: Creating lock for service
33192 1726883090.89201: worker is 1 (out of 1 available)
33192 1726883090.89215: exiting _queue_task() for managed_node1/service
33192 1726883090.89227: done queuing things up, now waiting for results queue to drain
33192 1726883090.89228: waiting for pending results...
33192 1726883090.89381: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33192 1726883090.89469: in run() - task 0affe814-3a2d-6c15-6a7e-000000000022 33192 1726883090.89480: variable 'ansible_search_path' from source: unknown 33192 1726883090.89484: variable 'ansible_search_path' from source: unknown 33192 1726883090.89512: calling self._execute() 33192 1726883090.89579: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.89583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.89592: variable 'omit' from source: magic vars 33192 1726883090.89867: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.89878: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.89977: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.89981: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.89984: when evaluation is False, skipping this task 33192 1726883090.89987: _execute() done 33192 1726883090.89993: dumping result to json 33192 1726883090.89996: done dumping result, returning 33192 1726883090.90011: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-000000000022] 33192 1726883090.90016: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000022 33192 1726883090.90106: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000022 33192 1726883090.90110: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883090.90164: no more pending results, returning what we have 33192 1726883090.90167: results queue empty 
33192 1726883090.90168: checking for any_errors_fatal 33192 1726883090.90176: done checking for any_errors_fatal 33192 1726883090.90177: checking for max_fail_percentage 33192 1726883090.90179: done checking for max_fail_percentage 33192 1726883090.90180: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.90181: done checking to see if all hosts have failed 33192 1726883090.90182: getting the remaining hosts for this loop 33192 1726883090.90183: done getting the remaining hosts for this loop 33192 1726883090.90187: getting the next task for host managed_node1 33192 1726883090.90193: done getting next task for host managed_node1 33192 1726883090.90196: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33192 1726883090.90199: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.90212: getting variables 33192 1726883090.90213: in VariableManager get_vars() 33192 1726883090.90249: Calling all_inventory to load vars for managed_node1 33192 1726883090.90252: Calling groups_inventory to load vars for managed_node1 33192 1726883090.90253: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.90261: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.90263: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.90266: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.90441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.90629: done with get_vars() 33192 1726883090.90639: done getting variables 33192 1726883090.90686: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.307 ****** 33192 1726883090.90707: entering _queue_task() for managed_node1/service 33192 1726883090.90874: worker is 1 (out of 1 available) 33192 1726883090.90887: exiting _queue_task() for managed_node1/service 33192 1726883090.90899: done queuing things up, now waiting for results queue to drain 33192 1726883090.90900: waiting for pending results... 
33192 1726883090.91045: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33192 1726883090.91118: in run() - task 0affe814-3a2d-6c15-6a7e-000000000023 33192 1726883090.91140: variable 'ansible_search_path' from source: unknown 33192 1726883090.91148: variable 'ansible_search_path' from source: unknown 33192 1726883090.91167: calling self._execute() 33192 1726883090.91225: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.91233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.91250: variable 'omit' from source: magic vars 33192 1726883090.91507: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.91517: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.91614: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.91619: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.91622: when evaluation is False, skipping this task 33192 1726883090.91627: _execute() done 33192 1726883090.91631: dumping result to json 33192 1726883090.91637: done dumping result, returning 33192 1726883090.91644: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-6c15-6a7e-000000000023] 33192 1726883090.91649: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000023 33192 1726883090.91740: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000023 33192 1726883090.91744: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33192 1726883090.91788: no more pending results, returning what we have 33192 1726883090.91792: results queue empty 33192 1726883090.91793: checking for any_errors_fatal 
33192 1726883090.91798: done checking for any_errors_fatal 33192 1726883090.91799: checking for max_fail_percentage 33192 1726883090.91801: done checking for max_fail_percentage 33192 1726883090.91802: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.91803: done checking to see if all hosts have failed 33192 1726883090.91804: getting the remaining hosts for this loop 33192 1726883090.91805: done getting the remaining hosts for this loop 33192 1726883090.91809: getting the next task for host managed_node1 33192 1726883090.91814: done getting next task for host managed_node1 33192 1726883090.91818: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33192 1726883090.91821: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.91840: getting variables 33192 1726883090.91841: in VariableManager get_vars() 33192 1726883090.91874: Calling all_inventory to load vars for managed_node1 33192 1726883090.91876: Calling groups_inventory to load vars for managed_node1 33192 1726883090.91878: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.91884: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.91886: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.91889: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.92033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.92223: done with get_vars() 33192 1726883090.92231: done getting variables 33192 1726883090.92275: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:44:50 -0400 (0:00:00.015) 0:00:04.323 ****** 33192 1726883090.92300: entering _queue_task() for managed_node1/service 33192 1726883090.92460: worker is 1 (out of 1 available) 33192 1726883090.92471: exiting _queue_task() for managed_node1/service 33192 1726883090.92483: done queuing things up, now waiting for results queue to drain 33192 1726883090.92484: waiting for pending results... 
33192 1726883090.92643: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33192 1726883090.92734: in run() - task 0affe814-3a2d-6c15-6a7e-000000000024 33192 1726883090.92748: variable 'ansible_search_path' from source: unknown 33192 1726883090.92751: variable 'ansible_search_path' from source: unknown 33192 1726883090.92783: calling self._execute() 33192 1726883090.92848: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.92854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.92865: variable 'omit' from source: magic vars 33192 1726883090.93206: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.93216: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.93317: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.93321: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.93326: when evaluation is False, skipping this task 33192 1726883090.93329: _execute() done 33192 1726883090.93336: dumping result to json 33192 1726883090.93340: done dumping result, returning 33192 1726883090.93348: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-6c15-6a7e-000000000024] 33192 1726883090.93353: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000024 33192 1726883090.93445: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000024 33192 1726883090.93448: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883090.93515: no more pending results, returning what we have 33192 1726883090.93518: results queue empty 33192 1726883090.93519: checking for any_errors_fatal 
33192 1726883090.93524: done checking for any_errors_fatal 33192 1726883090.93525: checking for max_fail_percentage 33192 1726883090.93527: done checking for max_fail_percentage 33192 1726883090.93528: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.93529: done checking to see if all hosts have failed 33192 1726883090.93530: getting the remaining hosts for this loop 33192 1726883090.93531: done getting the remaining hosts for this loop 33192 1726883090.93537: getting the next task for host managed_node1 33192 1726883090.93542: done getting next task for host managed_node1 33192 1726883090.93546: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 33192 1726883090.93549: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.93561: getting variables 33192 1726883090.93563: in VariableManager get_vars() 33192 1726883090.93595: Calling all_inventory to load vars for managed_node1 33192 1726883090.93597: Calling groups_inventory to load vars for managed_node1 33192 1726883090.93599: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.93605: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.93607: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.93609: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.93788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.93969: done with get_vars() 33192 1726883090.93977: done getting variables 33192 1726883090.94022: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:44:50 -0400 (0:00:00.017) 0:00:04.340 ****** 33192 1726883090.94045: entering _queue_task() for managed_node1/service 33192 1726883090.94207: worker is 1 (out of 1 available) 33192 1726883090.94220: exiting _queue_task() for managed_node1/service 33192 1726883090.94230: done queuing things up, now waiting for results queue to drain 33192 1726883090.94231: waiting for pending results... 
33192 1726883090.94376: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 33192 1726883090.94470: in run() - task 0affe814-3a2d-6c15-6a7e-000000000025 33192 1726883090.94474: variable 'ansible_search_path' from source: unknown 33192 1726883090.94539: variable 'ansible_search_path' from source: unknown 33192 1726883090.94544: calling self._execute() 33192 1726883090.94567: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.94582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.94590: variable 'omit' from source: magic vars 33192 1726883090.94858: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.94868: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.94966: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.94973: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.94977: when evaluation is False, skipping this task 33192 1726883090.94980: _execute() done 33192 1726883090.94982: dumping result to json 33192 1726883090.94988: done dumping result, returning 33192 1726883090.94995: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-6c15-6a7e-000000000025] 33192 1726883090.95001: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000025 33192 1726883090.95092: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000025 33192 1726883090.95095: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33192 1726883090.95161: no more pending results, returning what we have 33192 1726883090.95165: results queue empty 33192 1726883090.95166: checking for any_errors_fatal 33192 
1726883090.95170: done checking for any_errors_fatal 33192 1726883090.95173: checking for max_fail_percentage 33192 1726883090.95175: done checking for max_fail_percentage 33192 1726883090.95176: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.95177: done checking to see if all hosts have failed 33192 1726883090.95178: getting the remaining hosts for this loop 33192 1726883090.95179: done getting the remaining hosts for this loop 33192 1726883090.95182: getting the next task for host managed_node1 33192 1726883090.95186: done getting next task for host managed_node1 33192 1726883090.95189: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33192 1726883090.95191: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.95201: getting variables 33192 1726883090.95202: in VariableManager get_vars() 33192 1726883090.95232: Calling all_inventory to load vars for managed_node1 33192 1726883090.95235: Calling groups_inventory to load vars for managed_node1 33192 1726883090.95237: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.95244: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.95252: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.95256: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.95401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.95607: done with get_vars() 33192 1726883090.95614: done getting variables 33192 1726883090.95659: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.357 ****** 33192 1726883090.95682: entering _queue_task() for managed_node1/copy 33192 1726883090.95842: worker is 1 (out of 1 available) 33192 1726883090.95853: exiting _queue_task() for managed_node1/copy 33192 1726883090.95863: done queuing things up, now waiting for results queue to drain 33192 1726883090.95864: waiting for pending results... 
33192 1726883090.96017: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33192 1726883090.96106: in run() - task 0affe814-3a2d-6c15-6a7e-000000000026 33192 1726883090.96118: variable 'ansible_search_path' from source: unknown 33192 1726883090.96121: variable 'ansible_search_path' from source: unknown 33192 1726883090.96154: calling self._execute() 33192 1726883090.96218: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.96225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.96236: variable 'omit' from source: magic vars 33192 1726883090.96525: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.96537: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.96633: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.96639: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.96642: when evaluation is False, skipping this task 33192 1726883090.96651: _execute() done 33192 1726883090.96654: dumping result to json 33192 1726883090.96656: done dumping result, returning 33192 1726883090.96666: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-6c15-6a7e-000000000026] 33192 1726883090.96669: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000026 33192 1726883090.96762: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000026 33192 1726883090.96766: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883090.96815: no more pending results, returning what we have 33192 1726883090.96818: results queue empty 33192 
1726883090.96819: checking for any_errors_fatal 33192 1726883090.96825: done checking for any_errors_fatal 33192 1726883090.96826: checking for max_fail_percentage 33192 1726883090.96828: done checking for max_fail_percentage 33192 1726883090.96829: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.96830: done checking to see if all hosts have failed 33192 1726883090.96831: getting the remaining hosts for this loop 33192 1726883090.96832: done getting the remaining hosts for this loop 33192 1726883090.96838: getting the next task for host managed_node1 33192 1726883090.96843: done getting next task for host managed_node1 33192 1726883090.96846: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33192 1726883090.96849: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.96863: getting variables 33192 1726883090.96865: in VariableManager get_vars() 33192 1726883090.96904: Calling all_inventory to load vars for managed_node1 33192 1726883090.96906: Calling groups_inventory to load vars for managed_node1 33192 1726883090.96907: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.96914: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.96916: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.96918: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.97064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.97252: done with get_vars() 33192 1726883090.97260: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.373 ****** 33192 1726883090.97323: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 33192 1726883090.97325: Creating lock for fedora.linux_system_roles.network_connections 33192 1726883090.97491: worker is 1 (out of 1 available) 33192 1726883090.97503: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 33192 1726883090.97513: done queuing things up, now waiting for results queue to drain 33192 1726883090.97514: waiting for pending results... 
33192 1726883090.97664: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33192 1726883090.97737: in run() - task 0affe814-3a2d-6c15-6a7e-000000000027 33192 1726883090.97754: variable 'ansible_search_path' from source: unknown 33192 1726883090.97759: variable 'ansible_search_path' from source: unknown 33192 1726883090.97785: calling self._execute() 33192 1726883090.97843: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.97853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.97866: variable 'omit' from source: magic vars 33192 1726883090.98130: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.98140: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.98238: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.98243: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.98246: when evaluation is False, skipping this task 33192 1726883090.98252: _execute() done 33192 1726883090.98254: dumping result to json 33192 1726883090.98260: done dumping result, returning 33192 1726883090.98268: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-6c15-6a7e-000000000027] 33192 1726883090.98273: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000027 33192 1726883090.98376: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000027 33192 1726883090.98379: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883090.98437: no more pending results, returning what we have 33192 1726883090.98441: results queue empty 33192 1726883090.98442: checking 
for any_errors_fatal 33192 1726883090.98456: done checking for any_errors_fatal 33192 1726883090.98457: checking for max_fail_percentage 33192 1726883090.98459: done checking for max_fail_percentage 33192 1726883090.98460: checking to see if all hosts have failed and the running result is not ok 33192 1726883090.98461: done checking to see if all hosts have failed 33192 1726883090.98462: getting the remaining hosts for this loop 33192 1726883090.98463: done getting the remaining hosts for this loop 33192 1726883090.98467: getting the next task for host managed_node1 33192 1726883090.98471: done getting next task for host managed_node1 33192 1726883090.98474: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 33192 1726883090.98476: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883090.98486: getting variables 33192 1726883090.98487: in VariableManager get_vars() 33192 1726883090.98517: Calling all_inventory to load vars for managed_node1 33192 1726883090.98519: Calling groups_inventory to load vars for managed_node1 33192 1726883090.98521: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883090.98527: Calling all_plugins_play to load vars for managed_node1 33192 1726883090.98529: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883090.98531: Calling groups_plugins_play to load vars for managed_node1 33192 1726883090.98711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883090.98900: done with get_vars() 33192 1726883090.98907: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:50 -0400 (0:00:00.016) 0:00:04.390 ****** 33192 1726883090.98968: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 33192 1726883090.98969: Creating lock for fedora.linux_system_roles.network_state 33192 1726883090.99136: worker is 1 (out of 1 available) 33192 1726883090.99147: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 33192 1726883090.99157: done queuing things up, now waiting for results queue to drain 33192 1726883090.99159: waiting for pending results... 
33192 1726883090.99307: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 33192 1726883090.99392: in run() - task 0affe814-3a2d-6c15-6a7e-000000000028 33192 1726883090.99440: variable 'ansible_search_path' from source: unknown 33192 1726883090.99443: variable 'ansible_search_path' from source: unknown 33192 1726883090.99446: calling self._execute() 33192 1726883090.99492: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883090.99496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883090.99512: variable 'omit' from source: magic vars 33192 1726883090.99782: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.99792: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883090.99889: variable 'ansible_distribution_major_version' from source: facts 33192 1726883090.99893: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883090.99898: when evaluation is False, skipping this task 33192 1726883090.99900: _execute() done 33192 1726883090.99905: dumping result to json 33192 1726883090.99910: done dumping result, returning 33192 1726883090.99917: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-6c15-6a7e-000000000028] 33192 1726883090.99922: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000028 33192 1726883091.00020: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000028 33192 1726883091.00023: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.00093: no more pending results, returning what we have 33192 1726883091.00097: results queue empty 33192 1726883091.00098: checking for any_errors_fatal 33192 
1726883091.00103: done checking for any_errors_fatal 33192 1726883091.00104: checking for max_fail_percentage 33192 1726883091.00105: done checking for max_fail_percentage 33192 1726883091.00106: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.00108: done checking to see if all hosts have failed 33192 1726883091.00108: getting the remaining hosts for this loop 33192 1726883091.00110: done getting the remaining hosts for this loop 33192 1726883091.00113: getting the next task for host managed_node1 33192 1726883091.00117: done getting next task for host managed_node1 33192 1726883091.00120: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33192 1726883091.00122: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.00132: getting variables 33192 1726883091.00133: in VariableManager get_vars() 33192 1726883091.00165: Calling all_inventory to load vars for managed_node1 33192 1726883091.00167: Calling groups_inventory to load vars for managed_node1 33192 1726883091.00168: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.00177: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.00179: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.00181: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.00328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.00515: done with get_vars() 33192 1726883091.00523: done getting variables 33192 1726883091.00570: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:51 -0400 (0:00:00.016) 0:00:04.406 ****** 33192 1726883091.00594: entering _queue_task() for managed_node1/debug 33192 1726883091.00754: worker is 1 (out of 1 available) 33192 1726883091.00767: exiting _queue_task() for managed_node1/debug 33192 1726883091.00780: done queuing things up, now waiting for results queue to drain 33192 1726883091.00781: waiting for pending results... 
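[Editor's note] Each skip recorded in this trace follows the same two-step conditional evaluation: `ansible_distribution_major_version != '6'` evaluates True, then `ansible_distribution_major_version == '7'` evaluates False, so the task is skipped with "Conditional result was False". A minimal, hypothetical task of the shape that produces such a trace is sketched below; the actual role tasks under roles/network/tasks/main.yml may differ, and the variable name shown is an assumption.

```yaml
# Hypothetical sketch only -- not the actual role task source.
# Both `when` clauses are evaluated in order, as seen in the log:
# the first is True, the second is False on this host, so the
# task is skipped and a "false_condition" is reported.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines  # assumed variable name
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version == '7'
```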
33192 1726883091.00923: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33192 1726883091.01002: in run() - task 0affe814-3a2d-6c15-6a7e-000000000029 33192 1726883091.01020: variable 'ansible_search_path' from source: unknown 33192 1726883091.01024: variable 'ansible_search_path' from source: unknown 33192 1726883091.01048: calling self._execute() 33192 1726883091.01106: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.01114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.01126: variable 'omit' from source: magic vars 33192 1726883091.01432: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.01444: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.01539: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.01545: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.01549: when evaluation is False, skipping this task 33192 1726883091.01552: _execute() done 33192 1726883091.01557: dumping result to json 33192 1726883091.01560: done dumping result, returning 33192 1726883091.01578: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-6c15-6a7e-000000000029] 33192 1726883091.01581: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000029 33192 1726883091.01661: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000029 33192 1726883091.01664: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883091.01716: no more pending results, returning what we have 33192 1726883091.01719: results queue empty 33192 1726883091.01720: checking for any_errors_fatal 33192 1726883091.01724: done 
checking for any_errors_fatal 33192 1726883091.01725: checking for max_fail_percentage 33192 1726883091.01727: done checking for max_fail_percentage 33192 1726883091.01728: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.01729: done checking to see if all hosts have failed 33192 1726883091.01730: getting the remaining hosts for this loop 33192 1726883091.01731: done getting the remaining hosts for this loop 33192 1726883091.01745: getting the next task for host managed_node1 33192 1726883091.01750: done getting next task for host managed_node1 33192 1726883091.01754: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33192 1726883091.01757: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.01769: getting variables 33192 1726883091.01770: in VariableManager get_vars() 33192 1726883091.01802: Calling all_inventory to load vars for managed_node1 33192 1726883091.01804: Calling groups_inventory to load vars for managed_node1 33192 1726883091.01806: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.01812: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.01814: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.01816: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.01997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.02182: done with get_vars() 33192 1726883091.02190: done getting variables 33192 1726883091.02232: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:44:51 -0400 (0:00:00.016) 0:00:04.423 ****** 33192 1726883091.02255: entering _queue_task() for managed_node1/debug 33192 1726883091.02412: worker is 1 (out of 1 available) 33192 1726883091.02424: exiting _queue_task() for managed_node1/debug 33192 1726883091.02436: done queuing things up, now waiting for results queue to drain 33192 1726883091.02437: waiting for pending results... 
33192 1726883091.02590: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33192 1726883091.02673: in run() - task 0affe814-3a2d-6c15-6a7e-00000000002a 33192 1726883091.02689: variable 'ansible_search_path' from source: unknown 33192 1726883091.02693: variable 'ansible_search_path' from source: unknown 33192 1726883091.02720: calling self._execute() 33192 1726883091.02780: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.02785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.02797: variable 'omit' from source: magic vars 33192 1726883091.03073: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.03086: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.03183: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.03189: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.03192: when evaluation is False, skipping this task 33192 1726883091.03196: _execute() done 33192 1726883091.03202: dumping result to json 33192 1726883091.03205: done dumping result, returning 33192 1726883091.03214: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-6c15-6a7e-00000000002a] 33192 1726883091.03219: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000002a 33192 1726883091.03307: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000002a 33192 1726883091.03310: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883091.03370: no more pending results, returning what we have 33192 1726883091.03373: results queue empty 33192 1726883091.03374: checking for any_errors_fatal 33192 1726883091.03378: done 
checking for any_errors_fatal 33192 1726883091.03379: checking for max_fail_percentage 33192 1726883091.03381: done checking for max_fail_percentage 33192 1726883091.03382: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.03383: done checking to see if all hosts have failed 33192 1726883091.03384: getting the remaining hosts for this loop 33192 1726883091.03385: done getting the remaining hosts for this loop 33192 1726883091.03389: getting the next task for host managed_node1 33192 1726883091.03394: done getting next task for host managed_node1 33192 1726883091.03398: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33192 1726883091.03401: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.03414: getting variables 33192 1726883091.03416: in VariableManager get_vars() 33192 1726883091.03456: Calling all_inventory to load vars for managed_node1 33192 1726883091.03458: Calling groups_inventory to load vars for managed_node1 33192 1726883091.03460: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.03466: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.03468: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.03471: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.03616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.03972: done with get_vars() 33192 1726883091.03981: done getting variables 33192 1726883091.04024: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:44:51 -0400 (0:00:00.017) 0:00:04.440 ****** 33192 1726883091.04048: entering _queue_task() for managed_node1/debug 33192 1726883091.04205: worker is 1 (out of 1 available) 33192 1726883091.04218: exiting _queue_task() for managed_node1/debug 33192 1726883091.04229: done queuing things up, now waiting for results queue to drain 33192 1726883091.04230: waiting for pending results... 
33192 1726883091.04380: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33192 1726883091.04478: in run() - task 0affe814-3a2d-6c15-6a7e-00000000002b 33192 1726883091.04491: variable 'ansible_search_path' from source: unknown 33192 1726883091.04495: variable 'ansible_search_path' from source: unknown 33192 1726883091.04523: calling self._execute() 33192 1726883091.04586: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.04594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.04604: variable 'omit' from source: magic vars 33192 1726883091.04875: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.04888: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.04985: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.04989: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.04994: when evaluation is False, skipping this task 33192 1726883091.04997: _execute() done 33192 1726883091.05004: dumping result to json 33192 1726883091.05007: done dumping result, returning 33192 1726883091.05022: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-6c15-6a7e-00000000002b] 33192 1726883091.05025: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000002b 33192 1726883091.05112: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000002b 33192 1726883091.05118: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883091.05165: no more pending results, returning what we have 33192 1726883091.05168: results queue empty 33192 1726883091.05169: checking for any_errors_fatal 33192 1726883091.05174: done checking for 
any_errors_fatal 33192 1726883091.05175: checking for max_fail_percentage 33192 1726883091.05177: done checking for max_fail_percentage 33192 1726883091.05178: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.05179: done checking to see if all hosts have failed 33192 1726883091.05180: getting the remaining hosts for this loop 33192 1726883091.05182: done getting the remaining hosts for this loop 33192 1726883091.05185: getting the next task for host managed_node1 33192 1726883091.05190: done getting next task for host managed_node1 33192 1726883091.05194: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 33192 1726883091.05197: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.05211: getting variables 33192 1726883091.05212: in VariableManager get_vars() 33192 1726883091.05253: Calling all_inventory to load vars for managed_node1 33192 1726883091.05255: Calling groups_inventory to load vars for managed_node1 33192 1726883091.05257: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.05265: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.05267: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.05269: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.05417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.05614: done with get_vars() 33192 1726883091.05623: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:44:51 -0400 (0:00:00.016) 0:00:04.457 ****** 33192 1726883091.05698: entering _queue_task() for managed_node1/ping 33192 1726883091.05699: Creating lock for ping 33192 1726883091.05865: worker is 1 (out of 1 available) 33192 1726883091.05880: exiting _queue_task() for managed_node1/ping 33192 1726883091.05890: done queuing things up, now waiting for results queue to drain 33192 1726883091.05891: waiting for pending results... 
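[Editor's note] The "Re-test connectivity" task enters `_queue_task()` for the `ping` action ("Creating lock for ping"). Ansible's `ping` module performs a full connection plus module round-trip to the managed host, not an ICMP ping. A hypothetical sketch of a task that would produce this trace, under the same skip condition:

```yaml
# Hypothetical sketch only. A False `when` result skips the task
# before any connection attempt is made, as the log shows.
- name: Re-test connectivity
  ansible.builtin.ping:
  when: ansible_distribution_major_version == '7'
```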
33192 1726883091.06049: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 33192 1726883091.06133: in run() - task 0affe814-3a2d-6c15-6a7e-00000000002c 33192 1726883091.06144: variable 'ansible_search_path' from source: unknown 33192 1726883091.06148: variable 'ansible_search_path' from source: unknown 33192 1726883091.06233: calling self._execute() 33192 1726883091.06246: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.06250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.06256: variable 'omit' from source: magic vars 33192 1726883091.06538: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.06551: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.06645: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.06650: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.06656: when evaluation is False, skipping this task 33192 1726883091.06659: _execute() done 33192 1726883091.06662: dumping result to json 33192 1726883091.06678: done dumping result, returning 33192 1726883091.06682: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-6c15-6a7e-00000000002c] 33192 1726883091.06684: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000002c 33192 1726883091.06763: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000002c 33192 1726883091.06766: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.06819: no more pending results, returning what we have 33192 1726883091.06823: results queue empty 33192 1726883091.06824: checking for any_errors_fatal 33192 
1726883091.06830: done checking for any_errors_fatal 33192 1726883091.06831: checking for max_fail_percentage 33192 1726883091.06833: done checking for max_fail_percentage 33192 1726883091.06836: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.06837: done checking to see if all hosts have failed 33192 1726883091.06838: getting the remaining hosts for this loop 33192 1726883091.06839: done getting the remaining hosts for this loop 33192 1726883091.06843: getting the next task for host managed_node1 33192 1726883091.06852: done getting next task for host managed_node1 33192 1726883091.06854: ^ task is: TASK: meta (role_complete) 33192 1726883091.06857: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.06874: getting variables 33192 1726883091.06876: in VariableManager get_vars() 33192 1726883091.06909: Calling all_inventory to load vars for managed_node1 33192 1726883091.06911: Calling groups_inventory to load vars for managed_node1 33192 1726883091.06913: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.06919: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.06921: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.06923: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.07107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.07290: done with get_vars() 33192 1726883091.07298: done getting variables 33192 1726883091.07359: done queuing things up, now waiting for results queue to drain 33192 1726883091.07361: results queue empty 33192 1726883091.07361: checking for any_errors_fatal 33192 1726883091.07363: done checking for any_errors_fatal 33192 1726883091.07364: checking for max_fail_percentage 33192 1726883091.07364: done checking for max_fail_percentage 33192 1726883091.07365: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.07366: done checking to see if all hosts have failed 33192 1726883091.07366: getting the remaining hosts for this loop 33192 1726883091.07367: done getting the remaining hosts for this loop 33192 1726883091.07369: getting the next task for host managed_node1 33192 1726883091.07374: done getting next task for host managed_node1 33192 1726883091.07376: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33192 1726883091.07378: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33192 1726883091.07385: getting variables 33192 1726883091.07386: in VariableManager get_vars() 33192 1726883091.07399: Calling all_inventory to load vars for managed_node1 33192 1726883091.07401: Calling groups_inventory to load vars for managed_node1 33192 1726883091.07402: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.07405: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.07407: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.07409: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.07535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.07716: done with get_vars() 33192 1726883091.07723: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:51 -0400 (0:00:00.020) 0:00:04.478 ****** 33192 1726883091.07784: entering _queue_task() for managed_node1/include_tasks 33192 1726883091.07946: worker is 1 (out of 1 available) 33192 1726883091.07958: exiting _queue_task() for managed_node1/include_tasks 33192 1726883091.07969: done queuing things up, now waiting for results queue to drain 33192 1726883091.07970: waiting for pending results... 
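[Editor's note] "Ensure ansible_facts used by role" enters `_queue_task()` for `include_tasks`. When the `when` clause on an `include_tasks` task evaluates False, the include itself is skipped, so none of the included file's tasks are ever loaded or queued. A hypothetical sketch (the included filename is an assumption):

```yaml
# Hypothetical sketch only. Skipping the include suppresses the
# entire included task file, not just a single task.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml  # assumed filename
  when: ansible_distribution_major_version == '7'
```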
33192 1726883091.08114: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33192 1726883091.08214: in run() - task 0affe814-3a2d-6c15-6a7e-000000000063 33192 1726883091.08227: variable 'ansible_search_path' from source: unknown 33192 1726883091.08231: variable 'ansible_search_path' from source: unknown 33192 1726883091.08262: calling self._execute() 33192 1726883091.08327: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.08335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.08348: variable 'omit' from source: magic vars 33192 1726883091.08622: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.08638: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.08728: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.08732: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.08741: when evaluation is False, skipping this task 33192 1726883091.08746: _execute() done 33192 1726883091.08749: dumping result to json 33192 1726883091.08752: done dumping result, returning 33192 1726883091.08765: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-6c15-6a7e-000000000063] 33192 1726883091.08768: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000063 33192 1726883091.08858: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000063 33192 1726883091.08863: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.08909: no more pending results, returning what we have 33192 1726883091.08912: results queue empty 33192 1726883091.08914: checking for 
any_errors_fatal 33192 1726883091.08915: done checking for any_errors_fatal 33192 1726883091.08916: checking for max_fail_percentage 33192 1726883091.08918: done checking for max_fail_percentage 33192 1726883091.08919: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.08921: done checking to see if all hosts have failed 33192 1726883091.08921: getting the remaining hosts for this loop 33192 1726883091.08923: done getting the remaining hosts for this loop 33192 1726883091.08927: getting the next task for host managed_node1 33192 1726883091.08932: done getting next task for host managed_node1 33192 1726883091.08937: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 33192 1726883091.08941: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.08955: getting variables 33192 1726883091.08957: in VariableManager get_vars() 33192 1726883091.08992: Calling all_inventory to load vars for managed_node1 33192 1726883091.08994: Calling groups_inventory to load vars for managed_node1 33192 1726883091.08995: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.09002: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.09004: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.09006: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.09179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.09366: done with get_vars() 33192 1726883091.09374: done getting variables 33192 1726883091.09421: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:51 -0400 (0:00:00.016) 0:00:04.494 ****** 33192 1726883091.09444: entering _queue_task() for managed_node1/debug 33192 1726883091.09601: worker is 1 (out of 1 available) 33192 1726883091.09613: exiting _queue_task() for managed_node1/debug 33192 1726883091.09625: done queuing things up, now waiting for results queue to drain 33192 1726883091.09626: waiting for pending results... 
33192 1726883091.09784: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 33192 1726883091.09868: in run() - task 0affe814-3a2d-6c15-6a7e-000000000064 33192 1726883091.09882: variable 'ansible_search_path' from source: unknown 33192 1726883091.09885: variable 'ansible_search_path' from source: unknown 33192 1726883091.09913: calling self._execute() 33192 1726883091.10039: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.10043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.10045: variable 'omit' from source: magic vars 33192 1726883091.10396: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.10414: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.10560: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.10574: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.10584: when evaluation is False, skipping this task 33192 1726883091.10594: _execute() done 33192 1726883091.10603: dumping result to json 33192 1726883091.10614: done dumping result, returning 33192 1726883091.10626: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-6c15-6a7e-000000000064] 33192 1726883091.10645: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000064 33192 1726883091.10903: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000064 33192 1726883091.10907: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883091.10949: no more pending results, returning what we have 33192 1726883091.10953: results queue empty 33192 1726883091.10954: checking for any_errors_fatal 33192 1726883091.10959: done checking for any_errors_fatal 33192 1726883091.10960: 
checking for max_fail_percentage 33192 1726883091.10962: done checking for max_fail_percentage 33192 1726883091.10963: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.10964: done checking to see if all hosts have failed 33192 1726883091.10965: getting the remaining hosts for this loop 33192 1726883091.10967: done getting the remaining hosts for this loop 33192 1726883091.10970: getting the next task for host managed_node1 33192 1726883091.10976: done getting next task for host managed_node1 33192 1726883091.10980: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33192 1726883091.10983: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.11000: getting variables 33192 1726883091.11002: in VariableManager get_vars() 33192 1726883091.11048: Calling all_inventory to load vars for managed_node1 33192 1726883091.11052: Calling groups_inventory to load vars for managed_node1 33192 1726883091.11055: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.11065: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.11068: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.11072: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.11330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.11657: done with get_vars() 33192 1726883091.11669: done getting variables 33192 1726883091.11729: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:51 -0400 (0:00:00.023) 0:00:04.518 ****** 33192 1726883091.11767: entering _queue_task() for managed_node1/fail 33192 1726883091.11975: worker is 1 (out of 1 available) 33192 1726883091.11988: exiting _queue_task() for managed_node1/fail 33192 1726883091.12000: done queuing things up, now waiting for results queue to drain 33192 1726883091.12001: waiting for pending results... 
33192 1726883091.12258: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33192 1726883091.12414: in run() - task 0affe814-3a2d-6c15-6a7e-000000000065 33192 1726883091.12461: variable 'ansible_search_path' from source: unknown 33192 1726883091.12466: variable 'ansible_search_path' from source: unknown 33192 1726883091.12496: calling self._execute() 33192 1726883091.12588: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.12640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.12644: variable 'omit' from source: magic vars 33192 1726883091.13039: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.13060: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.13211: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.13231: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.13440: when evaluation is False, skipping this task 33192 1726883091.13443: _execute() done 33192 1726883091.13446: dumping result to json 33192 1726883091.13449: done dumping result, returning 33192 1726883091.13452: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-6c15-6a7e-000000000065] 33192 1726883091.13454: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000065 33192 1726883091.13516: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000065 33192 1726883091.13519: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 
33192 1726883091.13562: no more pending results, returning what we have 33192 1726883091.13566: results queue empty 33192 1726883091.13567: checking for any_errors_fatal 33192 1726883091.13572: done checking for any_errors_fatal 33192 1726883091.13573: checking for max_fail_percentage 33192 1726883091.13575: done checking for max_fail_percentage 33192 1726883091.13576: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.13577: done checking to see if all hosts have failed 33192 1726883091.13578: getting the remaining hosts for this loop 33192 1726883091.13579: done getting the remaining hosts for this loop 33192 1726883091.13583: getting the next task for host managed_node1 33192 1726883091.13588: done getting next task for host managed_node1 33192 1726883091.13592: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33192 1726883091.13595: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.13612: getting variables 33192 1726883091.13614: in VariableManager get_vars() 33192 1726883091.13658: Calling all_inventory to load vars for managed_node1 33192 1726883091.13662: Calling groups_inventory to load vars for managed_node1 33192 1726883091.13664: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.13674: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.13678: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.13682: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.13999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.14324: done with get_vars() 33192 1726883091.14338: done getting variables 33192 1726883091.14402: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:51 -0400 (0:00:00.026) 0:00:04.544 ****** 33192 1726883091.14439: entering _queue_task() for managed_node1/fail 33192 1726883091.14658: worker is 1 (out of 1 available) 33192 1726883091.14671: exiting _queue_task() for managed_node1/fail 33192 1726883091.14681: done queuing things up, now waiting for results queue to drain 33192 1726883091.14683: waiting for pending results... 
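The repeated pattern in the log above — two "Evaluated conditional" lines per task, with the first (`ansible_distribution_major_version != '6'`) coming up True and the second (`ansible_distribution_major_version == '7'`) coming up False, followed by "when evaluation is False, skipping this task" — is what an ordered `when:` list on a role task produces at this verbosity. A minimal sketch of a task shaped like the ones being skipped here (the task body and conditions are illustrative assumptions; the real tasks live in the role's `tasks/main.yml` at the paths the log prints):

```yaml
# Illustrative sketch only — not the actual role source.
# Items in the `when:` list are evaluated in order; the first item
# that evaluates False short-circuits the task, and -vvvv logs the
# failing expression as "false_condition" in the skip result.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version == '7'  # False on this host, so the task is skipped
```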
33192 1726883091.14941: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33192 1726883091.15085: in run() - task 0affe814-3a2d-6c15-6a7e-000000000066 33192 1726883091.15108: variable 'ansible_search_path' from source: unknown 33192 1726883091.15117: variable 'ansible_search_path' from source: unknown 33192 1726883091.15159: calling self._execute() 33192 1726883091.15252: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.15268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.15332: variable 'omit' from source: magic vars 33192 1726883091.15601: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.15604: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.15690: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.15693: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.15699: when evaluation is False, skipping this task 33192 1726883091.15704: _execute() done 33192 1726883091.15707: dumping result to json 33192 1726883091.15713: done dumping result, returning 33192 1726883091.15720: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-6c15-6a7e-000000000066] 33192 1726883091.15725: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000066 33192 1726883091.15823: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000066 33192 1726883091.15827: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.15879: no more 
pending results, returning what we have 33192 1726883091.15882: results queue empty 33192 1726883091.15883: checking for any_errors_fatal 33192 1726883091.15888: done checking for any_errors_fatal 33192 1726883091.15888: checking for max_fail_percentage 33192 1726883091.15890: done checking for max_fail_percentage 33192 1726883091.15891: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.15893: done checking to see if all hosts have failed 33192 1726883091.15894: getting the remaining hosts for this loop 33192 1726883091.15895: done getting the remaining hosts for this loop 33192 1726883091.15898: getting the next task for host managed_node1 33192 1726883091.15903: done getting next task for host managed_node1 33192 1726883091.15907: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33192 1726883091.15910: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.15925: getting variables 33192 1726883091.15926: in VariableManager get_vars() 33192 1726883091.15959: Calling all_inventory to load vars for managed_node1 33192 1726883091.15962: Calling groups_inventory to load vars for managed_node1 33192 1726883091.15963: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.15970: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.15974: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.15976: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.16123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.16312: done with get_vars() 33192 1726883091.16320: done getting variables 33192 1726883091.16367: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:44:51 -0400 (0:00:00.019) 0:00:04.564 ****** 33192 1726883091.16392: entering _queue_task() for managed_node1/fail 33192 1726883091.16553: worker is 1 (out of 1 available) 33192 1726883091.16565: exiting _queue_task() for managed_node1/fail 33192 1726883091.16580: done queuing things up, now waiting for results queue to drain 33192 1726883091.16581: waiting for pending results... 
33192 1726883091.16727: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33192 1726883091.16822: in run() - task 0affe814-3a2d-6c15-6a7e-000000000067 33192 1726883091.16827: variable 'ansible_search_path' from source: unknown 33192 1726883091.16829: variable 'ansible_search_path' from source: unknown 33192 1726883091.16858: calling self._execute() 33192 1726883091.16919: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.16932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.16943: variable 'omit' from source: magic vars 33192 1726883091.17293: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.17303: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.17401: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.17405: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.17410: when evaluation is False, skipping this task 33192 1726883091.17413: _execute() done 33192 1726883091.17418: dumping result to json 33192 1726883091.17423: done dumping result, returning 33192 1726883091.17430: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-6c15-6a7e-000000000067] 33192 1726883091.17437: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000067 33192 1726883091.17524: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000067 33192 1726883091.17527: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.17601: no more pending 
results, returning what we have 33192 1726883091.17604: results queue empty 33192 1726883091.17605: checking for any_errors_fatal 33192 1726883091.17611: done checking for any_errors_fatal 33192 1726883091.17612: checking for max_fail_percentage 33192 1726883091.17614: done checking for max_fail_percentage 33192 1726883091.17615: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.17616: done checking to see if all hosts have failed 33192 1726883091.17617: getting the remaining hosts for this loop 33192 1726883091.17619: done getting the remaining hosts for this loop 33192 1726883091.17622: getting the next task for host managed_node1 33192 1726883091.17626: done getting next task for host managed_node1 33192 1726883091.17629: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33192 1726883091.17631: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.17649: getting variables 33192 1726883091.17650: in VariableManager get_vars() 33192 1726883091.17682: Calling all_inventory to load vars for managed_node1 33192 1726883091.17684: Calling groups_inventory to load vars for managed_node1 33192 1726883091.17686: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.17692: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.17694: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.17697: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.17874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.18056: done with get_vars() 33192 1726883091.18064: done getting variables 33192 1726883091.18110: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:44:51 -0400 (0:00:00.017) 0:00:04.581 ****** 33192 1726883091.18131: entering _queue_task() for managed_node1/dnf 33192 1726883091.18293: worker is 1 (out of 1 available) 33192 1726883091.18305: exiting _queue_task() for managed_node1/dnf 33192 1726883091.18316: done queuing things up, now waiting for results queue to drain 33192 1726883091.18317: waiting for pending results... 
33192 1726883091.18472: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33192 1726883091.18558: in run() - task 0affe814-3a2d-6c15-6a7e-000000000068 33192 1726883091.18570: variable 'ansible_search_path' from source: unknown 33192 1726883091.18574: variable 'ansible_search_path' from source: unknown 33192 1726883091.18605: calling self._execute() 33192 1726883091.18667: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.18678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.18688: variable 'omit' from source: magic vars 33192 1726883091.18960: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.18972: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.19069: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.19077: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.19082: when evaluation is False, skipping this task 33192 1726883091.19085: _execute() done 33192 1726883091.19088: dumping result to json 33192 1726883091.19095: done dumping result, returning 33192 1726883091.19107: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-000000000068] 33192 1726883091.19110: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000068 33192 1726883091.19200: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000068 33192 1726883091.19203: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 33192 1726883091.19256: no more pending results, returning what we have 33192 1726883091.19259: results queue empty 33192 1726883091.19260: checking for any_errors_fatal 33192 1726883091.19265: done checking for any_errors_fatal 33192 1726883091.19266: checking for max_fail_percentage 33192 1726883091.19268: done checking for max_fail_percentage 33192 1726883091.19269: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.19270: done checking to see if all hosts have failed 33192 1726883091.19271: getting the remaining hosts for this loop 33192 1726883091.19272: done getting the remaining hosts for this loop 33192 1726883091.19275: getting the next task for host managed_node1 33192 1726883091.19281: done getting next task for host managed_node1 33192 1726883091.19285: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33192 1726883091.19288: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.19302: getting variables 33192 1726883091.19304: in VariableManager get_vars() 33192 1726883091.19338: Calling all_inventory to load vars for managed_node1 33192 1726883091.19341: Calling groups_inventory to load vars for managed_node1 33192 1726883091.19342: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.19349: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.19351: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.19354: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.19500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.19710: done with get_vars() 33192 1726883091.19717: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 33192 1726883091.19777: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:44:51 -0400 (0:00:00.016) 0:00:04.598 ****** 33192 1726883091.19797: entering _queue_task() for managed_node1/yum 33192 1726883091.19960: worker is 1 (out of 1 available) 33192 1726883091.19972: exiting _queue_task() for managed_node1/yum 33192 1726883091.19983: done queuing things up, now waiting for results queue to drain 33192 1726883091.19984: waiting for pending results... 
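The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above is ansible-core's plugin routing at work: the legacy `yum` action is resolved to the `dnf` implementation before the task executes, which is why the very next line loads `dnf.py` even though the task was queued as `managed_node1/yum`. Routing of this kind is declared in a collection runtime manifest; a hedged, illustrative fragment of the mechanism (not the exact file shipped with ansible-core):

```yaml
# meta/runtime.yml-style routing fragment (illustrative sketch):
# requests for the `yum` module and action are served by `dnf`.
plugin_routing:
  modules:
    yum:
      redirect: ansible.builtin.dnf
  action:
    yum:
      redirect: ansible.builtin.dnf
```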
33192 1726883091.20133: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33192 1726883091.20217: in run() - task 0affe814-3a2d-6c15-6a7e-000000000069 33192 1726883091.20230: variable 'ansible_search_path' from source: unknown 33192 1726883091.20233: variable 'ansible_search_path' from source: unknown 33192 1726883091.20264: calling self._execute() 33192 1726883091.20330: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.20334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.20349: variable 'omit' from source: magic vars 33192 1726883091.20619: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.20628: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.20727: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.20733: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.20738: when evaluation is False, skipping this task 33192 1726883091.20743: _execute() done 33192 1726883091.20746: dumping result to json 33192 1726883091.20751: done dumping result, returning 33192 1726883091.20762: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-000000000069] 33192 1726883091.20765: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000069 33192 1726883091.20864: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000069 33192 1726883091.20868: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 33192 1726883091.20921: no more pending results, returning what we have 33192 1726883091.20925: results queue empty 33192 1726883091.20926: checking for any_errors_fatal 33192 1726883091.20931: done checking for any_errors_fatal 33192 1726883091.20932: checking for max_fail_percentage 33192 1726883091.20936: done checking for max_fail_percentage 33192 1726883091.20937: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.20938: done checking to see if all hosts have failed 33192 1726883091.20939: getting the remaining hosts for this loop 33192 1726883091.20941: done getting the remaining hosts for this loop 33192 1726883091.20944: getting the next task for host managed_node1 33192 1726883091.20949: done getting next task for host managed_node1 33192 1726883091.20953: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33192 1726883091.20956: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.20970: getting variables 33192 1726883091.20972: in VariableManager get_vars() 33192 1726883091.21006: Calling all_inventory to load vars for managed_node1 33192 1726883091.21008: Calling groups_inventory to load vars for managed_node1 33192 1726883091.21009: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.21016: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.21018: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.21020: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.21167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.21356: done with get_vars() 33192 1726883091.21364: done getting variables 33192 1726883091.21408: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:44:51 -0400 (0:00:00.016) 0:00:04.614 ****** 33192 1726883091.21432: entering _queue_task() for managed_node1/fail 33192 1726883091.21595: worker is 1 (out of 1 available) 33192 1726883091.21608: exiting _queue_task() for managed_node1/fail 33192 1726883091.21620: done queuing things up, now waiting for results queue to drain 33192 1726883091.21621: waiting for pending results... 
33192 1726883091.21774: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33192 1726883091.21866: in run() - task 0affe814-3a2d-6c15-6a7e-00000000006a 33192 1726883091.21939: variable 'ansible_search_path' from source: unknown 33192 1726883091.21943: variable 'ansible_search_path' from source: unknown 33192 1726883091.21946: calling self._execute() 33192 1726883091.21967: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.21981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.21990: variable 'omit' from source: magic vars 33192 1726883091.22279: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.22290: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.22389: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.22394: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.22397: when evaluation is False, skipping this task 33192 1726883091.22400: _execute() done 33192 1726883091.22407: dumping result to json 33192 1726883091.22410: done dumping result, returning 33192 1726883091.22421: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-00000000006a] 33192 1726883091.22425: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000006a 33192 1726883091.22521: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000006a 33192 1726883091.22524: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.22577: no more pending results, returning what we have 
33192 1726883091.22580: results queue empty 33192 1726883091.22581: checking for any_errors_fatal 33192 1726883091.22586: done checking for any_errors_fatal 33192 1726883091.22587: checking for max_fail_percentage 33192 1726883091.22589: done checking for max_fail_percentage 33192 1726883091.22590: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.22591: done checking to see if all hosts have failed 33192 1726883091.22592: getting the remaining hosts for this loop 33192 1726883091.22593: done getting the remaining hosts for this loop 33192 1726883091.22597: getting the next task for host managed_node1 33192 1726883091.22602: done getting next task for host managed_node1 33192 1726883091.22605: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 33192 1726883091.22608: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.22623: getting variables 33192 1726883091.22625: in VariableManager get_vars() 33192 1726883091.22659: Calling all_inventory to load vars for managed_node1 33192 1726883091.22661: Calling groups_inventory to load vars for managed_node1 33192 1726883091.22663: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.22669: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.22674: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.22676: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.22853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.23041: done with get_vars() 33192 1726883091.23048: done getting variables 33192 1726883091.23095: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:44:51 -0400 (0:00:00.016) 0:00:04.631 ****** 33192 1726883091.23118: entering _queue_task() for managed_node1/package 33192 1726883091.23285: worker is 1 (out of 1 available) 33192 1726883091.23297: exiting _queue_task() for managed_node1/package 33192 1726883091.23309: done queuing things up, now waiting for results queue to drain 33192 1726883091.23310: waiting for pending results... 
33192 1726883091.23458: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 33192 1726883091.23544: in run() - task 0affe814-3a2d-6c15-6a7e-00000000006b 33192 1726883091.23557: variable 'ansible_search_path' from source: unknown 33192 1726883091.23561: variable 'ansible_search_path' from source: unknown 33192 1726883091.23590: calling self._execute() 33192 1726883091.23658: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.23668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.23740: variable 'omit' from source: magic vars 33192 1726883091.23947: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.23957: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.24054: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.24058: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.24063: when evaluation is False, skipping this task 33192 1726883091.24065: _execute() done 33192 1726883091.24074: dumping result to json 33192 1726883091.24077: done dumping result, returning 33192 1726883091.24085: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-6c15-6a7e-00000000006b] 33192 1726883091.24088: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000006b 33192 1726883091.24183: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000006b 33192 1726883091.24188: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.24240: no more pending results, returning what we have 33192 1726883091.24243: results queue empty 33192 1726883091.24244: checking for any_errors_fatal 33192 1726883091.24251: done 
checking for any_errors_fatal 33192 1726883091.24252: checking for max_fail_percentage 33192 1726883091.24253: done checking for max_fail_percentage 33192 1726883091.24254: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.24256: done checking to see if all hosts have failed 33192 1726883091.24256: getting the remaining hosts for this loop 33192 1726883091.24258: done getting the remaining hosts for this loop 33192 1726883091.24261: getting the next task for host managed_node1 33192 1726883091.24267: done getting next task for host managed_node1 33192 1726883091.24273: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 33192 1726883091.24276: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.24291: getting variables 33192 1726883091.24292: in VariableManager get_vars() 33192 1726883091.24323: Calling all_inventory to load vars for managed_node1 33192 1726883091.24325: Calling groups_inventory to load vars for managed_node1 33192 1726883091.24326: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.24333: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.24337: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.24340: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.24490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.24678: done with get_vars() 33192 1726883091.24686: done getting variables 33192 1726883091.24730: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:44:51 -0400 (0:00:00.016) 0:00:04.648 ****** 33192 1726883091.24754: entering _queue_task() for managed_node1/package 33192 1726883091.24920: worker is 1 (out of 1 available) 33192 1726883091.24933: exiting _queue_task() for managed_node1/package 33192 1726883091.24946: done queuing things up, now waiting for results queue to drain 33192 1726883091.24948: waiting for pending results... 
33192 1726883091.25094: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 33192 1726883091.25168: in run() - task 0affe814-3a2d-6c15-6a7e-00000000006c 33192 1726883091.25180: variable 'ansible_search_path' from source: unknown 33192 1726883091.25192: variable 'ansible_search_path' from source: unknown 33192 1726883091.25220: calling self._execute() 33192 1726883091.25288: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.25297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.25308: variable 'omit' from source: magic vars 33192 1726883091.25597: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.25607: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.25709: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.25714: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.25721: when evaluation is False, skipping this task 33192 1726883091.25724: _execute() done 33192 1726883091.25726: dumping result to json 33192 1726883091.25742: done dumping result, returning 33192 1726883091.25747: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-6c15-6a7e-00000000006c] 33192 1726883091.25750: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000006c 33192 1726883091.25843: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000006c 33192 1726883091.25848: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.25903: no more pending results, returning what we have 33192 1726883091.25907: 
results queue empty 33192 1726883091.25908: checking for any_errors_fatal 33192 1726883091.25913: done checking for any_errors_fatal 33192 1726883091.25913: checking for max_fail_percentage 33192 1726883091.25915: done checking for max_fail_percentage 33192 1726883091.25916: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.25917: done checking to see if all hosts have failed 33192 1726883091.25918: getting the remaining hosts for this loop 33192 1726883091.25919: done getting the remaining hosts for this loop 33192 1726883091.25923: getting the next task for host managed_node1 33192 1726883091.25929: done getting next task for host managed_node1 33192 1726883091.25932: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33192 1726883091.25938: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.25953: getting variables 33192 1726883091.25954: in VariableManager get_vars() 33192 1726883091.25993: Calling all_inventory to load vars for managed_node1 33192 1726883091.25996: Calling groups_inventory to load vars for managed_node1 33192 1726883091.25997: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.26004: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.26006: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.26008: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.26194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.26407: done with get_vars() 33192 1726883091.26415: done getting variables 33192 1726883091.26462: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:44:51 -0400 (0:00:00.017) 0:00:04.665 ****** 33192 1726883091.26489: entering _queue_task() for managed_node1/package 33192 1726883091.26658: worker is 1 (out of 1 available) 33192 1726883091.26675: exiting _queue_task() for managed_node1/package 33192 1726883091.26687: done queuing things up, now waiting for results queue to drain 33192 1726883091.26688: waiting for pending results... 
33192 1726883091.26845: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33192 1726883091.26936: in run() - task 0affe814-3a2d-6c15-6a7e-00000000006d 33192 1726883091.26949: variable 'ansible_search_path' from source: unknown 33192 1726883091.26953: variable 'ansible_search_path' from source: unknown 33192 1726883091.26984: calling self._execute() 33192 1726883091.27051: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.27058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.27067: variable 'omit' from source: magic vars 33192 1726883091.27357: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.27368: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.27463: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.27468: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.27474: when evaluation is False, skipping this task 33192 1726883091.27477: _execute() done 33192 1726883091.27480: dumping result to json 33192 1726883091.27483: done dumping result, returning 33192 1726883091.27493: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-6c15-6a7e-00000000006d] 33192 1726883091.27496: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000006d 33192 1726883091.27594: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000006d 33192 1726883091.27597: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.27643: no more pending results, returning what we have 33192 1726883091.27646: results queue 
empty 33192 1726883091.27647: checking for any_errors_fatal 33192 1726883091.27653: done checking for any_errors_fatal 33192 1726883091.27654: checking for max_fail_percentage 33192 1726883091.27657: done checking for max_fail_percentage 33192 1726883091.27658: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.27659: done checking to see if all hosts have failed 33192 1726883091.27660: getting the remaining hosts for this loop 33192 1726883091.27661: done getting the remaining hosts for this loop 33192 1726883091.27664: getting the next task for host managed_node1 33192 1726883091.27669: done getting next task for host managed_node1 33192 1726883091.27675: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33192 1726883091.27679: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.27698: getting variables 33192 1726883091.27700: in VariableManager get_vars() 33192 1726883091.27731: Calling all_inventory to load vars for managed_node1 33192 1726883091.27732: Calling groups_inventory to load vars for managed_node1 33192 1726883091.27736: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.27743: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.27745: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.27747: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.27895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.28085: done with get_vars() 33192 1726883091.28093: done getting variables 33192 1726883091.28139: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:44:51 -0400 (0:00:00.016) 0:00:04.682 ****** 33192 1726883091.28163: entering _queue_task() for managed_node1/service 33192 1726883091.28330: worker is 1 (out of 1 available) 33192 1726883091.28347: exiting _queue_task() for managed_node1/service 33192 1726883091.28358: done queuing things up, now waiting for results queue to drain 33192 1726883091.28359: waiting for pending results... 
33192 1726883091.28507: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33192 1726883091.28594: in run() - task 0affe814-3a2d-6c15-6a7e-00000000006e 33192 1726883091.28607: variable 'ansible_search_path' from source: unknown 33192 1726883091.28611: variable 'ansible_search_path' from source: unknown 33192 1726883091.28641: calling self._execute() 33192 1726883091.28702: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.28709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.28725: variable 'omit' from source: magic vars 33192 1726883091.29065: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.29077: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.29172: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.29180: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.29183: when evaluation is False, skipping this task 33192 1726883091.29186: _execute() done 33192 1726883091.29191: dumping result to json 33192 1726883091.29196: done dumping result, returning 33192 1726883091.29204: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-00000000006e] 33192 1726883091.29209: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000006e 33192 1726883091.29302: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000006e 33192 1726883091.29305: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.29350: no more pending results, returning what we have 33192 1726883091.29354: results queue empty 
33192 1726883091.29355: checking for any_errors_fatal 33192 1726883091.29362: done checking for any_errors_fatal 33192 1726883091.29363: checking for max_fail_percentage 33192 1726883091.29365: done checking for max_fail_percentage 33192 1726883091.29366: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.29367: done checking to see if all hosts have failed 33192 1726883091.29368: getting the remaining hosts for this loop 33192 1726883091.29370: done getting the remaining hosts for this loop 33192 1726883091.29373: getting the next task for host managed_node1 33192 1726883091.29379: done getting next task for host managed_node1 33192 1726883091.29382: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33192 1726883091.29385: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.29401: getting variables 33192 1726883091.29403: in VariableManager get_vars() 33192 1726883091.29439: Calling all_inventory to load vars for managed_node1 33192 1726883091.29441: Calling groups_inventory to load vars for managed_node1 33192 1726883091.29443: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.29450: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.29452: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.29454: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.29625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.29812: done with get_vars() 33192 1726883091.29820: done getting variables 33192 1726883091.29867: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:44:51 -0400 (0:00:00.017) 0:00:04.699 ****** 33192 1726883091.29890: entering _queue_task() for managed_node1/service 33192 1726883091.30053: worker is 1 (out of 1 available) 33192 1726883091.30068: exiting _queue_task() for managed_node1/service 33192 1726883091.30079: done queuing things up, now waiting for results queue to drain 33192 1726883091.30081: waiting for pending results... 
33192 1726883091.30229: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33192 1726883091.30313: in run() - task 0affe814-3a2d-6c15-6a7e-00000000006f 33192 1726883091.30325: variable 'ansible_search_path' from source: unknown 33192 1726883091.30328: variable 'ansible_search_path' from source: unknown 33192 1726883091.30356: calling self._execute() 33192 1726883091.30421: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.30427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.30438: variable 'omit' from source: magic vars 33192 1726883091.30705: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.30714: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.30814: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.30818: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.30823: when evaluation is False, skipping this task 33192 1726883091.30826: _execute() done 33192 1726883091.30831: dumping result to json 33192 1726883091.30837: done dumping result, returning 33192 1726883091.30845: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-6c15-6a7e-00000000006f] 33192 1726883091.30848: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000006f 33192 1726883091.30941: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000006f 33192 1726883091.30944: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33192 1726883091.30998: no more pending results, returning what we have 33192 1726883091.31002: results queue empty 33192 1726883091.31003: checking for any_errors_fatal 
33192 1726883091.31008: done checking for any_errors_fatal 33192 1726883091.31009: checking for max_fail_percentage 33192 1726883091.31011: done checking for max_fail_percentage 33192 1726883091.31012: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.31013: done checking to see if all hosts have failed 33192 1726883091.31014: getting the remaining hosts for this loop 33192 1726883091.31015: done getting the remaining hosts for this loop 33192 1726883091.31019: getting the next task for host managed_node1 33192 1726883091.31025: done getting next task for host managed_node1 33192 1726883091.31028: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33192 1726883091.31031: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.31048: getting variables 33192 1726883091.31049: in VariableManager get_vars() 33192 1726883091.31081: Calling all_inventory to load vars for managed_node1 33192 1726883091.31083: Calling groups_inventory to load vars for managed_node1 33192 1726883091.31084: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.31090: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.31092: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.31095: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.31241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.31446: done with get_vars() 33192 1726883091.31453: done getting variables 33192 1726883091.31500: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:44:51 -0400 (0:00:00.016) 0:00:04.715 ****** 33192 1726883091.31523: entering _queue_task() for managed_node1/service 33192 1726883091.31687: worker is 1 (out of 1 available) 33192 1726883091.31700: exiting _queue_task() for managed_node1/service 33192 1726883091.31712: done queuing things up, now waiting for results queue to drain 33192 1726883091.31713: waiting for pending results... 
33192 1726883091.31861: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
33192 1726883091.31944: in run() - task 0affe814-3a2d-6c15-6a7e-000000000070
33192 1726883091.31957: variable 'ansible_search_path' from source: unknown
33192 1726883091.31960: variable 'ansible_search_path' from source: unknown
33192 1726883091.31987: calling self._execute()
33192 1726883091.32047: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.32051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.32067: variable 'omit' from source: magic vars
33192 1726883091.32331: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.32342: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.32443: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.32448: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.32451: when evaluation is False, skipping this task
33192 1726883091.32457: _execute() done
33192 1726883091.32459: dumping result to json
33192 1726883091.32464: done dumping result, returning
33192 1726883091.32471: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-6c15-6a7e-000000000070]
33192 1726883091.32479: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000070
33192 1726883091.32572: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000070
33192 1726883091.32576: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883091.32638: no more pending results, returning what we have
33192 1726883091.32642: results queue empty
33192 1726883091.32643: checking for any_errors_fatal
33192 1726883091.32649: done checking for any_errors_fatal
33192 1726883091.32650: checking for max_fail_percentage
33192 1726883091.32652: done checking for max_fail_percentage
33192 1726883091.32653: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.32654: done checking to see if all hosts have failed
33192 1726883091.32655: getting the remaining hosts for this loop
33192 1726883091.32656: done getting the remaining hosts for this loop
33192 1726883091.32660: getting the next task for host managed_node1
33192 1726883091.32666: done getting next task for host managed_node1
33192 1726883091.32669: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
33192 1726883091.32673: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.32684: getting variables
33192 1726883091.32685: in VariableManager get_vars()
33192 1726883091.32715: Calling all_inventory to load vars for managed_node1
33192 1726883091.32717: Calling groups_inventory to load vars for managed_node1
33192 1726883091.32718: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.32724: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.32726: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.32729: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.32876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.33064: done with get_vars()
33192 1726883091.33072: done getting variables
33192 1726883091.33118: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 21:44:51 -0400 (0:00:00.016) 0:00:04.731 ******
33192 1726883091.33142: entering _queue_task() for managed_node1/service
33192 1726883091.33308: worker is 1 (out of 1 available)
33192 1726883091.33322: exiting _queue_task() for managed_node1/service
33192 1726883091.33336: done queuing things up, now waiting for results queue to drain
33192 1726883091.33337: waiting for pending results...
33192 1726883091.33481: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service
33192 1726883091.33560: in run() - task 0affe814-3a2d-6c15-6a7e-000000000071
33192 1726883091.33578: variable 'ansible_search_path' from source: unknown
33192 1726883091.33582: variable 'ansible_search_path' from source: unknown
33192 1726883091.33608: calling self._execute()
33192 1726883091.33675: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.33679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.33695: variable 'omit' from source: magic vars
33192 1726883091.33980: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.33991: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.34090: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.34093: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.34099: when evaluation is False, skipping this task
33192 1726883091.34102: _execute() done
33192 1726883091.34107: dumping result to json
33192 1726883091.34111: done dumping result, returning
33192 1726883091.34121: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-6c15-6a7e-000000000071]
33192 1726883091.34124: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000071
33192 1726883091.34219: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000071
33192 1726883091.34222: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
33192 1726883091.34280: no more pending results, returning what we have
33192 1726883091.34283: results queue empty
33192 1726883091.34284: checking for any_errors_fatal
33192 1726883091.34289: done checking for any_errors_fatal
33192 1726883091.34290: checking for max_fail_percentage
33192 1726883091.34292: done checking for max_fail_percentage
33192 1726883091.34293: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.34294: done checking to see if all hosts have failed
33192 1726883091.34295: getting the remaining hosts for this loop
33192 1726883091.34297: done getting the remaining hosts for this loop
33192 1726883091.34300: getting the next task for host managed_node1
33192 1726883091.34306: done getting next task for host managed_node1
33192 1726883091.34310: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
33192 1726883091.34313: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.34326: getting variables
33192 1726883091.34328: in VariableManager get_vars()
33192 1726883091.34362: Calling all_inventory to load vars for managed_node1
33192 1726883091.34364: Calling groups_inventory to load vars for managed_node1
33192 1726883091.34365: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.34373: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.34375: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.34378: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.34555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.34786: done with get_vars()
33192 1726883091.34796: done getting variables
33192 1726883091.34858: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 21:44:51 -0400 (0:00:00.017) 0:00:04.749 ******
33192 1726883091.34894: entering _queue_task() for managed_node1/copy
33192 1726883091.35106: worker is 1 (out of 1 available)
33192 1726883091.35119: exiting _queue_task() for managed_node1/copy
33192 1726883091.35132: done queuing things up, now waiting for results queue to drain
33192 1726883091.35133: waiting for pending results...
33192 1726883091.35551: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
33192 1726883091.35557: in run() - task 0affe814-3a2d-6c15-6a7e-000000000072
33192 1726883091.35575: variable 'ansible_search_path' from source: unknown
33192 1726883091.35584: variable 'ansible_search_path' from source: unknown
33192 1726883091.35628: calling self._execute()
33192 1726883091.35722: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.35739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.35758: variable 'omit' from source: magic vars
33192 1726883091.36140: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.36151: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.36252: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.36258: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.36260: when evaluation is False, skipping this task
33192 1726883091.36274: _execute() done
33192 1726883091.36278: dumping result to json
33192 1726883091.36280: done dumping result, returning
33192 1726883091.36287: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-6c15-6a7e-000000000072]
33192 1726883091.36292: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000072
33192 1726883091.36384: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000072
33192 1726883091.36387: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883091.36437: no more pending results, returning what we have
33192 1726883091.36440: results queue empty
33192 1726883091.36441: checking for any_errors_fatal
33192 1726883091.36447: done checking for any_errors_fatal
33192 1726883091.36448: checking for max_fail_percentage
33192 1726883091.36449: done checking for max_fail_percentage
33192 1726883091.36450: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.36452: done checking to see if all hosts have failed
33192 1726883091.36453: getting the remaining hosts for this loop
33192 1726883091.36454: done getting the remaining hosts for this loop
33192 1726883091.36457: getting the next task for host managed_node1
33192 1726883091.36463: done getting next task for host managed_node1
33192 1726883091.36466: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
33192 1726883091.36469: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.36485: getting variables
33192 1726883091.36487: in VariableManager get_vars()
33192 1726883091.36528: Calling all_inventory to load vars for managed_node1
33192 1726883091.36530: Calling groups_inventory to load vars for managed_node1
33192 1726883091.36531: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.36540: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.36542: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.36545: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.36690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.36882: done with get_vars()
33192 1726883091.36890: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 21:44:51 -0400 (0:00:00.020) 0:00:04.770 ******
33192 1726883091.36955: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
33192 1726883091.37122: worker is 1 (out of 1 available)
33192 1726883091.37138: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
33192 1726883091.37149: done queuing things up, now waiting for results queue to drain
33192 1726883091.37151: waiting for pending results...
33192 1726883091.37305: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
33192 1726883091.37392: in run() - task 0affe814-3a2d-6c15-6a7e-000000000073
33192 1726883091.37404: variable 'ansible_search_path' from source: unknown
33192 1726883091.37408: variable 'ansible_search_path' from source: unknown
33192 1726883091.37436: calling self._execute()
33192 1726883091.37501: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.37508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.37518: variable 'omit' from source: magic vars
33192 1726883091.37792: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.37801: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.37900: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.37904: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.37909: when evaluation is False, skipping this task
33192 1726883091.37912: _execute() done
33192 1726883091.37918: dumping result to json
33192 1726883091.37920: done dumping result, returning
33192 1726883091.37935: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-6c15-6a7e-000000000073]
33192 1726883091.37940: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000073
33192 1726883091.38026: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000073
33192 1726883091.38034: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883091.38088: no more pending results, returning what we have
33192 1726883091.38091: results queue empty
33192 1726883091.38093: checking for any_errors_fatal
33192 1726883091.38098: done checking for any_errors_fatal
33192 1726883091.38099: checking for max_fail_percentage
33192 1726883091.38101: done checking for max_fail_percentage
33192 1726883091.38102: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.38103: done checking to see if all hosts have failed
33192 1726883091.38104: getting the remaining hosts for this loop
33192 1726883091.38105: done getting the remaining hosts for this loop
33192 1726883091.38109: getting the next task for host managed_node1
33192 1726883091.38114: done getting next task for host managed_node1
33192 1726883091.38117: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
33192 1726883091.38120: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.38140: getting variables
33192 1726883091.38142: in VariableManager get_vars()
33192 1726883091.38176: Calling all_inventory to load vars for managed_node1
33192 1726883091.38178: Calling groups_inventory to load vars for managed_node1
33192 1726883091.38180: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.38187: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.38189: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.38191: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.38375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.38558: done with get_vars()
33192 1726883091.38566: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 21:44:51 -0400 (0:00:00.016) 0:00:04.786 ******
33192 1726883091.38629: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state
33192 1726883091.38793: worker is 1 (out of 1 available)
33192 1726883091.38808: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state
33192 1726883091.38821: done queuing things up, now waiting for results queue to drain
33192 1726883091.38823: waiting for pending results...
33192 1726883091.39043: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state
33192 1726883091.39180: in run() - task 0affe814-3a2d-6c15-6a7e-000000000074
33192 1726883091.39203: variable 'ansible_search_path' from source: unknown
33192 1726883091.39212: variable 'ansible_search_path' from source: unknown
33192 1726883091.39263: calling self._execute()
33192 1726883091.39537: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.39541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.39543: variable 'omit' from source: magic vars
33192 1726883091.39868: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.39887: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.40025: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.40039: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.40046: when evaluation is False, skipping this task
33192 1726883091.40053: _execute() done
33192 1726883091.40064: dumping result to json
33192 1726883091.40072: done dumping result, returning
33192 1726883091.40084: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-6c15-6a7e-000000000074]
33192 1726883091.40094: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000074
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883091.40247: no more pending results, returning what we have
33192 1726883091.40252: results queue empty
33192 1726883091.40253: checking for any_errors_fatal
33192 1726883091.40260: done checking for any_errors_fatal
33192 1726883091.40261: checking for max_fail_percentage
33192 1726883091.40262: done checking for max_fail_percentage
33192 1726883091.40264: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.40265: done checking to see if all hosts have failed
33192 1726883091.40266: getting the remaining hosts for this loop
33192 1726883091.40268: done getting the remaining hosts for this loop
33192 1726883091.40275: getting the next task for host managed_node1
33192 1726883091.40290: done getting next task for host managed_node1
33192 1726883091.40294: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
33192 1726883091.40298: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.40315: getting variables
33192 1726883091.40317: in VariableManager get_vars()
33192 1726883091.40360: Calling all_inventory to load vars for managed_node1
33192 1726883091.40363: Calling groups_inventory to load vars for managed_node1
33192 1726883091.40366: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.40377: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.40381: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.40386: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000074
33192 1726883091.40389: WORKER PROCESS EXITING
33192 1726883091.40395: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.40661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.41022: done with get_vars()
33192 1726883091.41033: done getting variables
33192 1726883091.41105: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 21:44:51 -0400 (0:00:00.025) 0:00:04.811 ******
33192 1726883091.41143: entering _queue_task() for managed_node1/debug
33192 1726883091.41505: worker is 1 (out of 1 available)
33192 1726883091.41517: exiting _queue_task() for managed_node1/debug
33192 1726883091.41529: done queuing things up, now waiting for results queue to drain
33192 1726883091.41530: waiting for pending results...
33192 1726883091.41723: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
33192 1726883091.41929: in run() - task 0affe814-3a2d-6c15-6a7e-000000000075
33192 1726883091.41932: variable 'ansible_search_path' from source: unknown
33192 1726883091.41938: variable 'ansible_search_path' from source: unknown
33192 1726883091.41967: calling self._execute()
33192 1726883091.42067: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.42142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.42145: variable 'omit' from source: magic vars
33192 1726883091.42979: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.42996: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.43165: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.43186: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.43194: when evaluation is False, skipping this task
33192 1726883091.43202: _execute() done
33192 1726883091.43210: dumping result to json
33192 1726883091.43218: done dumping result, returning
33192 1726883091.43291: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-6c15-6a7e-000000000075]
33192 1726883091.43295: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000075
33192 1726883091.43365: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000075
33192 1726883091.43368: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
33192 1726883091.43439: no more pending results, returning what we have
33192 1726883091.43443: results queue empty
33192 1726883091.43449: checking for any_errors_fatal
33192 1726883091.43455: done checking for any_errors_fatal
33192 1726883091.43456: checking for max_fail_percentage
33192 1726883091.43459: done checking for max_fail_percentage
33192 1726883091.43460: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.43461: done checking to see if all hosts have failed
33192 1726883091.43462: getting the remaining hosts for this loop
33192 1726883091.43464: done getting the remaining hosts for this loop
33192 1726883091.43469: getting the next task for host managed_node1
33192 1726883091.43479: done getting next task for host managed_node1
33192 1726883091.43482: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
33192 1726883091.43486: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.43505: getting variables
33192 1726883091.43507: in VariableManager get_vars()
33192 1726883091.43648: Calling all_inventory to load vars for managed_node1
33192 1726883091.43652: Calling groups_inventory to load vars for managed_node1
33192 1726883091.43655: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.43743: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.43751: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.43756: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.44320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.44683: done with get_vars()
33192 1726883091.44693: done getting variables
33192 1726883091.44765: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 21:44:51 -0400 (0:00:00.036) 0:00:04.848 ******
33192 1726883091.44800: entering _queue_task() for managed_node1/debug
33192 1726883091.45044: worker is 1 (out of 1 available)
33192 1726883091.45174: exiting _queue_task() for managed_node1/debug
33192 1726883091.45185: done queuing things up, now waiting for results queue to drain
33192 1726883091.45186: waiting for pending results...
33192 1726883091.45406: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
33192 1726883091.45610: in run() - task 0affe814-3a2d-6c15-6a7e-000000000076
33192 1726883091.45614: variable 'ansible_search_path' from source: unknown
33192 1726883091.45618: variable 'ansible_search_path' from source: unknown
33192 1726883091.45621: calling self._execute()
33192 1726883091.45716: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.45732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.45757: variable 'omit' from source: magic vars
33192 1726883091.46253: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.46265: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.46386: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.46398: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.46408: when evaluation is False, skipping this task
33192 1726883091.46416: _execute() done
33192 1726883091.46424: dumping result to json
33192 1726883091.46433: done dumping result, returning
33192 1726883091.46450: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-6c15-6a7e-000000000076]
33192 1726883091.46462: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000076
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
33192 1726883091.46758: no more pending results, returning what we have
33192 1726883091.46762: results queue empty
33192 1726883091.46764: checking for any_errors_fatal
33192 1726883091.46774: done checking for any_errors_fatal
33192 1726883091.46775: checking for max_fail_percentage
33192 1726883091.46777: done checking for max_fail_percentage
33192 1726883091.46779: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.46780: done checking to see if all hosts have failed
33192 1726883091.46781: getting the remaining hosts for this loop
33192 1726883091.46783: done getting the remaining hosts for this loop
33192 1726883091.46788: getting the next task for host managed_node1
33192 1726883091.46795: done getting next task for host managed_node1
33192 1726883091.46799: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
33192 1726883091.46804: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.46825: getting variables
33192 1726883091.46827: in VariableManager get_vars()
33192 1726883091.46995: Calling all_inventory to load vars for managed_node1
33192 1726883091.46998: Calling groups_inventory to load vars for managed_node1
33192 1726883091.47001: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.47011: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000076
33192 1726883091.47014: WORKER PROCESS EXITING
33192 1726883091.47023: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.47027: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.47031: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.47314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.47708: done with get_vars()
33192 1726883091.47725: done getting variables
33192 1726883091.47801: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:44:51 -0400 (0:00:00.030) 0:00:04.878 ******
33192 1726883091.47845: entering _queue_task() for managed_node1/debug
33192 1726883091.48207: worker is 1 (out of 1 available)
33192 1726883091.48220: exiting _queue_task() for managed_node1/debug
33192 1726883091.48232: done queuing things up, now waiting for results queue to drain
33192 1726883091.48236: waiting for pending results...
33192 1726883091.48439: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33192 1726883091.48618: in run() - task 0affe814-3a2d-6c15-6a7e-000000000077 33192 1726883091.48645: variable 'ansible_search_path' from source: unknown 33192 1726883091.48653: variable 'ansible_search_path' from source: unknown 33192 1726883091.48708: calling self._execute() 33192 1726883091.48812: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.48826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.48847: variable 'omit' from source: magic vars 33192 1726883091.49355: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.49358: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.49513: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.49525: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.49536: when evaluation is False, skipping this task 33192 1726883091.49544: _execute() done 33192 1726883091.49573: dumping result to json 33192 1726883091.49577: done dumping result, returning 33192 1726883091.49579: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-6c15-6a7e-000000000077] 33192 1726883091.49623: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000077 skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883091.49838: no more pending results, returning what we have 33192 1726883091.49843: results queue empty 33192 1726883091.49845: checking for any_errors_fatal 33192 1726883091.49852: done checking for any_errors_fatal 33192 1726883091.49853: checking for max_fail_percentage 33192 1726883091.49855: done checking for max_fail_percentage 33192 
1726883091.49857: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.49858: done checking to see if all hosts have failed 33192 1726883091.49859: getting the remaining hosts for this loop 33192 1726883091.49861: done getting the remaining hosts for this loop 33192 1726883091.49866: getting the next task for host managed_node1 33192 1726883091.49876: done getting next task for host managed_node1 33192 1726883091.49880: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 33192 1726883091.49885: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.49907: getting variables 33192 1726883091.49909: in VariableManager get_vars() 33192 1726883091.49963: Calling all_inventory to load vars for managed_node1 33192 1726883091.49967: Calling groups_inventory to load vars for managed_node1 33192 1726883091.49970: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.49985: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.49988: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.49992: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.50369: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000077 33192 1726883091.50375: WORKER PROCESS EXITING 33192 1726883091.50401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.50767: done with get_vars() 33192 1726883091.50781: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:44:51 -0400 (0:00:00.030) 0:00:04.909 ****** 33192 1726883091.50892: entering _queue_task() for managed_node1/ping 33192 1726883091.51248: worker is 1 (out of 1 available) 33192 1726883091.51259: exiting _queue_task() for managed_node1/ping 33192 1726883091.51269: done queuing things up, now waiting for results queue to drain 33192 1726883091.51273: waiting for pending results... 
33192 1726883091.51454: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 33192 1726883091.51616: in run() - task 0affe814-3a2d-6c15-6a7e-000000000078 33192 1726883091.51638: variable 'ansible_search_path' from source: unknown 33192 1726883091.51646: variable 'ansible_search_path' from source: unknown 33192 1726883091.51698: calling self._execute() 33192 1726883091.51800: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.51813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.51833: variable 'omit' from source: magic vars 33192 1726883091.52284: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.52300: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.52468: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.52488: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.52496: when evaluation is False, skipping this task 33192 1726883091.52504: _execute() done 33192 1726883091.52512: dumping result to json 33192 1726883091.52520: done dumping result, returning 33192 1726883091.52531: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-6c15-6a7e-000000000078] 33192 1726883091.52656: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000078 33192 1726883091.52730: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000078 33192 1726883091.52736: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.52802: no more pending results, returning what we have 33192 1726883091.52807: results queue empty 33192 1726883091.52809: checking for any_errors_fatal 33192 
1726883091.52820: done checking for any_errors_fatal 33192 1726883091.52821: checking for max_fail_percentage 33192 1726883091.52824: done checking for max_fail_percentage 33192 1726883091.52825: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.52826: done checking to see if all hosts have failed 33192 1726883091.52827: getting the remaining hosts for this loop 33192 1726883091.52829: done getting the remaining hosts for this loop 33192 1726883091.52836: getting the next task for host managed_node1 33192 1726883091.52847: done getting next task for host managed_node1 33192 1726883091.52850: ^ task is: TASK: meta (role_complete) 33192 1726883091.52854: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.52946: getting variables 33192 1726883091.52948: in VariableManager get_vars() 33192 1726883091.53054: Calling all_inventory to load vars for managed_node1 33192 1726883091.53058: Calling groups_inventory to load vars for managed_node1 33192 1726883091.53061: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.53073: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.53077: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.53081: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.53399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.53762: done with get_vars() 33192 1726883091.53775: done getting variables 33192 1726883091.53873: done queuing things up, now waiting for results queue to drain 33192 1726883091.53876: results queue empty 33192 1726883091.53877: checking for any_errors_fatal 33192 1726883091.53879: done checking for any_errors_fatal 33192 1726883091.53880: checking for max_fail_percentage 33192 1726883091.53881: done checking for max_fail_percentage 33192 1726883091.53882: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.53882: done checking to see if all hosts have failed 33192 1726883091.53883: getting the remaining hosts for this loop 33192 1726883091.53884: done getting the remaining hosts for this loop 33192 1726883091.53886: getting the next task for host managed_node1 33192 1726883091.53890: done getting next task for host managed_node1 33192 1726883091.53892: ^ task is: TASK: TEST: wireless connection with 802.1x TLS-EAP 33192 1726883091.53894: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 33192 1726883091.53896: getting variables 33192 1726883091.53897: in VariableManager get_vars() 33192 1726883091.53914: Calling all_inventory to load vars for managed_node1 33192 1726883091.53916: Calling groups_inventory to load vars for managed_node1 33192 1726883091.53918: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.53923: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.53926: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.53928: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.54139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.54478: done with get_vars() 33192 1726883091.54490: done getting variables 33192 1726883091.54546: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: wireless connection with 802.1x TLS-EAP] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:53 Friday 20 September 2024 21:44:51 -0400 (0:00:00.036) 0:00:04.946 ****** 33192 1726883091.54576: entering _queue_task() for managed_node1/debug 33192 1726883091.54827: worker is 1 (out of 1 available) 33192 1726883091.54956: exiting _queue_task() for managed_node1/debug 33192 1726883091.54970: done queuing things up, now waiting for results queue to drain 33192 1726883091.54973: waiting for pending results... 
33192 1726883091.55183: running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with 802.1x TLS-EAP 33192 1726883091.55299: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000a8 33192 1726883091.55381: variable 'ansible_search_path' from source: unknown 33192 1726883091.55389: calling self._execute() 33192 1726883091.55490: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.55510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.55529: variable 'omit' from source: magic vars 33192 1726883091.56032: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.56054: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.56212: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.56225: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.56252: when evaluation is False, skipping this task 33192 1726883091.56261: _execute() done 33192 1726883091.56264: dumping result to json 33192 1726883091.56361: done dumping result, returning 33192 1726883091.56366: done running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with 802.1x TLS-EAP [0affe814-3a2d-6c15-6a7e-0000000000a8] 33192 1726883091.56369: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000a8 33192 1726883091.56439: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000a8 33192 1726883091.56442: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883091.56502: no more pending results, returning what we have 33192 1726883091.56506: results queue empty 33192 1726883091.56508: checking for any_errors_fatal 33192 1726883091.56510: done checking for any_errors_fatal 33192 1726883091.56511: checking for max_fail_percentage 33192 1726883091.56512: done checking for max_fail_percentage 33192 
1726883091.56514: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.56515: done checking to see if all hosts have failed 33192 1726883091.56516: getting the remaining hosts for this loop 33192 1726883091.56519: done getting the remaining hosts for this loop 33192 1726883091.56523: getting the next task for host managed_node1 33192 1726883091.56533: done getting next task for host managed_node1 33192 1726883091.56540: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33192 1726883091.56545: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.56654: getting variables 33192 1726883091.56657: in VariableManager get_vars() 33192 1726883091.56718: Calling all_inventory to load vars for managed_node1 33192 1726883091.56722: Calling groups_inventory to load vars for managed_node1 33192 1726883091.56725: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.56798: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.56803: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.56808: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.57133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.57506: done with get_vars() 33192 1726883091.57518: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:51 -0400 (0:00:00.030) 0:00:04.976 ****** 33192 1726883091.57630: entering _queue_task() for managed_node1/include_tasks 33192 1726883091.57969: worker is 1 (out of 1 available) 33192 1726883091.57983: exiting _queue_task() for managed_node1/include_tasks 33192 1726883091.57994: done queuing things up, now waiting for results queue to drain 33192 1726883091.57996: waiting for pending results... 
33192 1726883091.58351: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33192 1726883091.58356: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000b0 33192 1726883091.58359: variable 'ansible_search_path' from source: unknown 33192 1726883091.58380: variable 'ansible_search_path' from source: unknown 33192 1726883091.58420: calling self._execute() 33192 1726883091.58523: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.58539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.58555: variable 'omit' from source: magic vars 33192 1726883091.58994: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.59012: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.59183: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.59195: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.59203: when evaluation is False, skipping this task 33192 1726883091.59243: _execute() done 33192 1726883091.59251: dumping result to json 33192 1726883091.59253: done dumping result, returning 33192 1726883091.59256: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-6c15-6a7e-0000000000b0] 33192 1726883091.59259: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b0 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.59513: no more pending results, returning what we have 33192 1726883091.59518: results queue empty 33192 1726883091.59519: checking for any_errors_fatal 33192 1726883091.59526: done checking for any_errors_fatal 33192 1726883091.59527: checking for max_fail_percentage 33192 
1726883091.59529: done checking for max_fail_percentage 33192 1726883091.59530: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.59531: done checking to see if all hosts have failed 33192 1726883091.59532: getting the remaining hosts for this loop 33192 1726883091.59536: done getting the remaining hosts for this loop 33192 1726883091.59541: getting the next task for host managed_node1 33192 1726883091.59550: done getting next task for host managed_node1 33192 1726883091.59553: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 33192 1726883091.59557: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.59584: getting variables 33192 1726883091.59586: in VariableManager get_vars() 33192 1726883091.59753: Calling all_inventory to load vars for managed_node1 33192 1726883091.59756: Calling groups_inventory to load vars for managed_node1 33192 1726883091.59760: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.59767: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b0 33192 1726883091.59770: WORKER PROCESS EXITING 33192 1726883091.59780: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.59784: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.59788: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.60052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.60423: done with get_vars() 33192 1726883091.60436: done getting variables 33192 1726883091.60503: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:51 -0400 (0:00:00.029) 0:00:05.005 ****** 33192 1726883091.60546: entering _queue_task() for managed_node1/debug 33192 1726883091.60777: worker is 1 (out of 1 available) 33192 1726883091.60789: exiting _queue_task() for managed_node1/debug 33192 1726883091.60800: done queuing things up, now waiting for results queue to drain 33192 1726883091.60802: waiting for pending results... 
33192 1726883091.61081: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 33192 1726883091.61233: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000b1 33192 1726883091.61262: variable 'ansible_search_path' from source: unknown 33192 1726883091.61284: variable 'ansible_search_path' from source: unknown 33192 1726883091.61370: calling self._execute() 33192 1726883091.61429: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.61446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.61462: variable 'omit' from source: magic vars 33192 1726883091.61898: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.61923: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.62097: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.62134: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.62147: when evaluation is False, skipping this task 33192 1726883091.62155: _execute() done 33192 1726883091.62158: dumping result to json 33192 1726883091.62160: done dumping result, returning 33192 1726883091.62240: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-6c15-6a7e-0000000000b1] 33192 1726883091.62244: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b1 33192 1726883091.62322: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b1 33192 1726883091.62325: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883091.62391: no more pending results, returning what we have 33192 1726883091.62395: results queue empty 33192 1726883091.62397: checking for any_errors_fatal 33192 1726883091.62402: done checking for any_errors_fatal 33192 1726883091.62404: 
checking for max_fail_percentage 33192 1726883091.62406: done checking for max_fail_percentage 33192 1726883091.62407: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.62408: done checking to see if all hosts have failed 33192 1726883091.62409: getting the remaining hosts for this loop 33192 1726883091.62412: done getting the remaining hosts for this loop 33192 1726883091.62416: getting the next task for host managed_node1 33192 1726883091.62424: done getting next task for host managed_node1 33192 1726883091.62429: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33192 1726883091.62433: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.62459: getting variables 33192 1726883091.62461: in VariableManager get_vars() 33192 1726883091.62627: Calling all_inventory to load vars for managed_node1 33192 1726883091.62631: Calling groups_inventory to load vars for managed_node1 33192 1726883091.62695: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.62706: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.62709: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.62713: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.63045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.63403: done with get_vars() 33192 1726883091.63421: done getting variables 33192 1726883091.63495: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:51 -0400 (0:00:00.029) 0:00:05.035 ****** 33192 1726883091.63537: entering _queue_task() for managed_node1/fail 33192 1726883091.63887: worker is 1 (out of 1 available) 33192 1726883091.63900: exiting _queue_task() for managed_node1/fail 33192 1726883091.63911: done queuing things up, now waiting for results queue to drain 33192 1726883091.63913: waiting for pending results... 
33192 1726883091.64155: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33192 1726883091.64363: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000b2 33192 1726883091.64368: variable 'ansible_search_path' from source: unknown 33192 1726883091.64373: variable 'ansible_search_path' from source: unknown 33192 1726883091.64377: calling self._execute() 33192 1726883091.64456: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.64482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.64504: variable 'omit' from source: magic vars 33192 1726883091.64967: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.64989: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.65162: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.65178: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.65187: when evaluation is False, skipping this task 33192 1726883091.65238: _execute() done 33192 1726883091.65244: dumping result to json 33192 1726883091.65247: done dumping result, returning 33192 1726883091.65251: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-6c15-6a7e-0000000000b2] 33192 1726883091.65253: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b2 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.65511: no more pending results, returning what we have 33192 1726883091.65515: results queue empty 33192 1726883091.65516: 
checking for any_errors_fatal 33192 1726883091.65523: done checking for any_errors_fatal 33192 1726883091.65524: checking for max_fail_percentage 33192 1726883091.65526: done checking for max_fail_percentage 33192 1726883091.65527: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.65529: done checking to see if all hosts have failed 33192 1726883091.65530: getting the remaining hosts for this loop 33192 1726883091.65532: done getting the remaining hosts for this loop 33192 1726883091.65538: getting the next task for host managed_node1 33192 1726883091.65544: done getting next task for host managed_node1 33192 1726883091.65548: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33192 1726883091.65552: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.65577: getting variables 33192 1726883091.65579: in VariableManager get_vars() 33192 1726883091.65627: Calling all_inventory to load vars for managed_node1 33192 1726883091.65631: Calling groups_inventory to load vars for managed_node1 33192 1726883091.65730: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.65740: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b2 33192 1726883091.65743: WORKER PROCESS EXITING 33192 1726883091.65759: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.65763: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.65767: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.66028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.66384: done with get_vars() 33192 1726883091.66396: done getting variables 33192 1726883091.66469: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:51 -0400 (0:00:00.029) 0:00:05.065 ****** 33192 1726883091.66506: entering _queue_task() for managed_node1/fail 33192 1726883091.66851: worker is 1 (out of 1 available) 33192 1726883091.66862: exiting _queue_task() for managed_node1/fail 33192 1726883091.66875: done queuing things up, now waiting for results queue to drain 33192 1726883091.66877: waiting for pending results... 
33192 1726883091.67053: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
33192 1726883091.67222: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000b3
33192 1726883091.67246: variable 'ansible_search_path' from source: unknown
33192 1726883091.67262: variable 'ansible_search_path' from source: unknown
33192 1726883091.67313: calling self._execute()
33192 1726883091.67414: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.67428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.67446: variable 'omit' from source: magic vars
33192 1726883091.67908: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.67911: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.68076: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.68125: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.68128: when evaluation is False, skipping this task
33192 1726883091.68131: _execute() done
33192 1726883091.68135: dumping result to json
33192 1726883091.68138: done dumping result, returning
33192 1726883091.68140: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-6c15-6a7e-0000000000b3]
33192 1726883091.68144: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b3
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883091.68427: no more pending results, returning what we have
33192 1726883091.68430: results queue empty
33192 1726883091.68432: checking for any_errors_fatal
33192 1726883091.68440: done checking for any_errors_fatal
33192 1726883091.68441: checking for max_fail_percentage
33192 1726883091.68443: done checking for max_fail_percentage
33192 1726883091.68444: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.68445: done checking to see if all hosts have failed
33192 1726883091.68446: getting the remaining hosts for this loop
33192 1726883091.68448: done getting the remaining hosts for this loop
33192 1726883091.68452: getting the next task for host managed_node1
33192 1726883091.68458: done getting next task for host managed_node1
33192 1726883091.68462: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
33192 1726883091.68465: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.68499: getting variables
33192 1726883091.68501: in VariableManager get_vars()
33192 1726883091.68551: Calling all_inventory to load vars for managed_node1
33192 1726883091.68554: Calling groups_inventory to load vars for managed_node1
33192 1726883091.68557: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.68569: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.68575: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.68579: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.68670: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b3
33192 1726883091.68676: WORKER PROCESS EXITING
33192 1726883091.68982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.69327: done with get_vars()
33192 1726883091.69340: done getting variables
33192 1726883091.69409: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024  21:44:51 -0400 (0:00:00.029)       0:00:05.094 ******
33192 1726883091.69444: entering _queue_task() for managed_node1/fail
33192 1726883091.69726: worker is 1 (out of 1 available)
33192 1726883091.69759: exiting _queue_task() for managed_node1/fail
33192 1726883091.69774: done queuing things up, now waiting for results queue to drain
33192 1726883091.69776: waiting for pending results...
33192 1726883091.69939: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
33192 1726883091.70039: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000b4
33192 1726883091.70054: variable 'ansible_search_path' from source: unknown
33192 1726883091.70058: variable 'ansible_search_path' from source: unknown
33192 1726883091.70092: calling self._execute()
33192 1726883091.70164: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.70173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.70181: variable 'omit' from source: magic vars
33192 1726883091.70490: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.70500: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.70601: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.70605: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.70610: when evaluation is False, skipping this task
33192 1726883091.70613: _execute() done
33192 1726883091.70618: dumping result to json
33192 1726883091.70623: done dumping result, returning
33192 1726883091.70631: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-6c15-6a7e-0000000000b4]
33192 1726883091.70634: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b4
33192 1726883091.70729: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b4
33192 1726883091.70732: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883091.70802: no more pending results, returning what we have
33192 1726883091.70805: results queue empty
33192 1726883091.70806: checking for any_errors_fatal
33192 1726883091.70812: done checking for any_errors_fatal
33192 1726883091.70813: checking for max_fail_percentage
33192 1726883091.70815: done checking for max_fail_percentage
33192 1726883091.70816: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.70817: done checking to see if all hosts have failed
33192 1726883091.70818: getting the remaining hosts for this loop
33192 1726883091.70819: done getting the remaining hosts for this loop
33192 1726883091.70823: getting the next task for host managed_node1
33192 1726883091.70829: done getting next task for host managed_node1
33192 1726883091.70832: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
33192 1726883091.70837: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.70852: getting variables
33192 1726883091.70853: in VariableManager get_vars()
33192 1726883091.70889: Calling all_inventory to load vars for managed_node1
33192 1726883091.70891: Calling groups_inventory to load vars for managed_node1
33192 1726883091.70893: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.70899: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.70901: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.70903: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.71052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.71246: done with get_vars()
33192 1726883091.71254: done getting variables
33192 1726883091.71305: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024  21:44:51 -0400 (0:00:00.018)       0:00:05.113 ******
33192 1726883091.71327: entering _queue_task() for managed_node1/dnf
33192 1726883091.71499: worker is 1 (out of 1 available)
33192 1726883091.71513: exiting _queue_task() for managed_node1/dnf
33192 1726883091.71525: done queuing things up, now waiting for results queue to drain
33192 1726883091.71526: waiting for pending results...
33192 1726883091.71684: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
33192 1726883091.71775: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000b5
33192 1726883091.71787: variable 'ansible_search_path' from source: unknown
33192 1726883091.71795: variable 'ansible_search_path' from source: unknown
33192 1726883091.71838: calling self._execute()
33192 1726883091.72039: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.72043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.72046: variable 'omit' from source: magic vars
33192 1726883091.72385: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.72404: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.72546: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.72559: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.72568: when evaluation is False, skipping this task
33192 1726883091.72580: _execute() done
33192 1726883091.72588: dumping result to json
33192 1726883091.72597: done dumping result, returning
33192 1726883091.72608: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-0000000000b5]
33192 1726883091.72617: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b5
33192 1726883091.72841: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b5
33192 1726883091.72844: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883091.72890: no more pending results, returning what we have
33192 1726883091.72894: results queue empty
33192 1726883091.72895: checking for any_errors_fatal
33192 1726883091.72901: done checking for any_errors_fatal
33192 1726883091.72902: checking for max_fail_percentage
33192 1726883091.72904: done checking for max_fail_percentage
33192 1726883091.72905: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.72906: done checking to see if all hosts have failed
33192 1726883091.72907: getting the remaining hosts for this loop
33192 1726883091.72908: done getting the remaining hosts for this loop
33192 1726883091.72912: getting the next task for host managed_node1
33192 1726883091.72918: done getting next task for host managed_node1
33192 1726883091.72922: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
33192 1726883091.72925: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.72946: getting variables
33192 1726883091.72948: in VariableManager get_vars()
33192 1726883091.72996: Calling all_inventory to load vars for managed_node1
33192 1726883091.73000: Calling groups_inventory to load vars for managed_node1
33192 1726883091.73002: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.73011: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.73014: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.73018: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.73219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.73412: done with get_vars()
33192 1726883091.73420: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
33192 1726883091.73480: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024  21:44:51 -0400 (0:00:00.021)       0:00:05.135 ******
33192 1726883091.73501: entering _queue_task() for managed_node1/yum
33192 1726883091.73664: worker is 1 (out of 1 available)
33192 1726883091.73676: exiting _queue_task() for managed_node1/yum
33192 1726883091.73688: done queuing things up, now waiting for results queue to drain
33192 1726883091.73689: waiting for pending results...
33192 1726883091.73850: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
33192 1726883091.73942: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000b6
33192 1726883091.73955: variable 'ansible_search_path' from source: unknown
33192 1726883091.73959: variable 'ansible_search_path' from source: unknown
33192 1726883091.73991: calling self._execute()
33192 1726883091.74058: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.74065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.74075: variable 'omit' from source: magic vars
33192 1726883091.74370: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.74383: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.74479: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.74487: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.74490: when evaluation is False, skipping this task
33192 1726883091.74495: _execute() done
33192 1726883091.74498: dumping result to json
33192 1726883091.74503: done dumping result, returning
33192 1726883091.74509: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-0000000000b6]
33192 1726883091.74515: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b6
33192 1726883091.74644: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b6
33192 1726883091.74647: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883091.74709: no more pending results, returning what we have
33192 1726883091.74711: results queue empty
33192 1726883091.74712: checking for any_errors_fatal
33192 1726883091.74715: done checking for any_errors_fatal
33192 1726883091.74716: checking for max_fail_percentage
33192 1726883091.74717: done checking for max_fail_percentage
33192 1726883091.74718: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.74718: done checking to see if all hosts have failed
33192 1726883091.74719: getting the remaining hosts for this loop
33192 1726883091.74720: done getting the remaining hosts for this loop
33192 1726883091.74723: getting the next task for host managed_node1
33192 1726883091.74726: done getting next task for host managed_node1
33192 1726883091.74729: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
33192 1726883091.74731: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.74744: getting variables
33192 1726883091.74745: in VariableManager get_vars()
33192 1726883091.74779: Calling all_inventory to load vars for managed_node1
33192 1726883091.74781: Calling groups_inventory to load vars for managed_node1
33192 1726883091.74783: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.74789: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.74791: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.74794: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.74943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.75150: done with get_vars()
33192 1726883091.75157: done getting variables
33192 1726883091.75216: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024  21:44:51 -0400 (0:00:00.017)       0:00:05.152 ******
33192 1726883091.75241: entering _queue_task() for managed_node1/fail
33192 1726883091.75445: worker is 1 (out of 1 available)
33192 1726883091.75457: exiting _queue_task() for managed_node1/fail
33192 1726883091.75469: done queuing things up, now waiting for results queue to drain
33192 1726883091.75470: waiting for pending results...
33192 1726883091.75855: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
33192 1726883091.75868: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000b7
33192 1726883091.75890: variable 'ansible_search_path' from source: unknown
33192 1726883091.75900: variable 'ansible_search_path' from source: unknown
33192 1726883091.75947: calling self._execute()
33192 1726883091.76041: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.76055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.76074: variable 'omit' from source: magic vars
33192 1726883091.76479: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.76504: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.76650: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.76663: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.76672: when evaluation is False, skipping this task
33192 1726883091.76716: _execute() done
33192 1726883091.76720: dumping result to json
33192 1726883091.76722: done dumping result, returning
33192 1726883091.76725: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-0000000000b7]
33192 1726883091.76728: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b7
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883091.76987: no more pending results, returning what we have
33192 1726883091.76991: results queue empty
33192 1726883091.76992: checking for any_errors_fatal
33192 1726883091.76998: done checking for any_errors_fatal
33192 1726883091.76999: checking for max_fail_percentage
33192 1726883091.77001: done checking for max_fail_percentage
33192 1726883091.77002: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.77003: done checking to see if all hosts have failed
33192 1726883091.77004: getting the remaining hosts for this loop
33192 1726883091.77006: done getting the remaining hosts for this loop
33192 1726883091.77010: getting the next task for host managed_node1
33192 1726883091.77016: done getting next task for host managed_node1
33192 1726883091.77020: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
33192 1726883091.77023: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.77044: getting variables
33192 1726883091.77046: in VariableManager get_vars()
33192 1726883091.77093: Calling all_inventory to load vars for managed_node1
33192 1726883091.77097: Calling groups_inventory to load vars for managed_node1
33192 1726883091.77100: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.77110: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.77114: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.77118: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.77405: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b7
33192 1726883091.77408: WORKER PROCESS EXITING
33192 1726883091.77436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.77763: done with get_vars()
33192 1726883091.77774: done getting variables
33192 1726883091.77837: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024  21:44:51 -0400 (0:00:00.026)       0:00:05.179 ******
33192 1726883091.77870: entering _queue_task() for managed_node1/package
33192 1726883091.78078: worker is 1 (out of 1 available)
33192 1726883091.78092: exiting _queue_task() for managed_node1/package
33192 1726883091.78104: done queuing things up, now waiting for results queue to drain
33192 1726883091.78105: waiting for pending results...
33192 1726883091.78364: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
33192 1726883091.78504: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000b8
33192 1726883091.78528: variable 'ansible_search_path' from source: unknown
33192 1726883091.78541: variable 'ansible_search_path' from source: unknown
33192 1726883091.78585: calling self._execute()
33192 1726883091.78672: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.78685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.78699: variable 'omit' from source: magic vars
33192 1726883091.79090: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.79112: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.79265: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.79277: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.79286: when evaluation is False, skipping this task
33192 1726883091.79295: _execute() done
33192 1726883091.79302: dumping result to json
33192 1726883091.79310: done dumping result, returning
33192 1726883091.79326: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-6c15-6a7e-0000000000b8]
33192 1726883091.79337: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b8
33192 1726883091.79602: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b8
33192 1726883091.79605: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883091.79651: no more pending results, returning what we have
33192 1726883091.79654: results queue empty
33192 1726883091.79655: checking for any_errors_fatal
33192 1726883091.79662: done checking for any_errors_fatal
33192 1726883091.79663: checking for max_fail_percentage
33192 1726883091.79665: done checking for max_fail_percentage
33192 1726883091.79666: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.79667: done checking to see if all hosts have failed
33192 1726883091.79668: getting the remaining hosts for this loop
33192 1726883091.79670: done getting the remaining hosts for this loop
33192 1726883091.79673: getting the next task for host managed_node1
33192 1726883091.79679: done getting next task for host managed_node1
33192 1726883091.79683: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
33192 1726883091.79686: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.79704: getting variables
33192 1726883091.79705: in VariableManager get_vars()
33192 1726883091.79752: Calling all_inventory to load vars for managed_node1
33192 1726883091.79755: Calling groups_inventory to load vars for managed_node1
33192 1726883091.79759: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.79768: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.79771: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.79775: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.80061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.80392: done with get_vars()
33192 1726883091.80403: done getting variables
33192 1726883091.80464: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024  21:44:51 -0400 (0:00:00.026)       0:00:05.205 ******
33192 1726883091.80498: entering _queue_task() for managed_node1/package
33192 1726883091.80711: worker is 1 (out of 1 available)
33192 1726883091.80724: exiting _queue_task() for managed_node1/package
33192 1726883091.80739: done queuing things up, now waiting for results queue to drain
33192 1726883091.80740: waiting for pending results...
33192 1726883091.80998: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
33192 1726883091.81340: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000b9
33192 1726883091.81344: variable 'ansible_search_path' from source: unknown
33192 1726883091.81348: variable 'ansible_search_path' from source: unknown
33192 1726883091.81351: calling self._execute()
33192 1726883091.81353: variable 'ansible_host' from source: host vars for 'managed_node1'
33192 1726883091.81356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33192 1726883091.81358: variable 'omit' from source: magic vars
33192 1726883091.81723: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.81744: Evaluated conditional (ansible_distribution_major_version != '6'): True
33192 1726883091.81894: variable 'ansible_distribution_major_version' from source: facts
33192 1726883091.81912: Evaluated conditional (ansible_distribution_major_version == '7'): False
33192 1726883091.81920: when evaluation is False, skipping this task
33192 1726883091.81928: _execute() done
33192 1726883091.81938: dumping result to json
33192 1726883091.81946: done dumping result, returning
33192 1726883091.81959: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-6c15-6a7e-0000000000b9]
33192 1726883091.81970: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b9
33192 1726883091.82240: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000b9
33192 1726883091.82244: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33192 1726883091.82284: no more pending results, returning what we have
33192 1726883091.82288: results queue empty
33192 1726883091.82289: checking for any_errors_fatal
33192 1726883091.82294: done checking for any_errors_fatal
33192 1726883091.82295: checking for max_fail_percentage
33192 1726883091.82297: done checking for max_fail_percentage
33192 1726883091.82298: checking to see if all hosts have failed and the running result is not ok
33192 1726883091.82300: done checking to see if all hosts have failed
33192 1726883091.82301: getting the remaining hosts for this loop
33192 1726883091.82303: done getting the remaining hosts for this loop
33192 1726883091.82306: getting the next task for host managed_node1
33192 1726883091.82312: done getting next task for host managed_node1
33192 1726883091.82316: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
33192 1726883091.82319: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883091.82339: getting variables
33192 1726883091.82341: in VariableManager get_vars()
33192 1726883091.82385: Calling all_inventory to load vars for managed_node1
33192 1726883091.82388: Calling groups_inventory to load vars for managed_node1
33192 1726883091.82391: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883091.82400: Calling all_plugins_play to load vars for managed_node1
33192 1726883091.82404: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883091.82408: Calling groups_plugins_play to load vars for managed_node1
33192 1726883091.82663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883091.82990: done with get_vars()
33192 1726883091.83002: done getting variables
33192 1726883091.83066: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024  21:44:51 -0400 (0:00:00.025)       0:00:05.231 ******
33192 1726883091.83100: entering _queue_task() for managed_node1/package
33192 1726883091.83311: worker is 1 (out of 1 available)
33192 1726883091.83324: exiting _queue_task() for managed_node1/package
33192 1726883091.83336: done queuing things up, now waiting for results queue to drain
33192 1726883091.83338: waiting for pending results...
33192 1726883091.83599: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33192 1726883091.83755: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000ba 33192 1726883091.83781: variable 'ansible_search_path' from source: unknown 33192 1726883091.83791: variable 'ansible_search_path' from source: unknown 33192 1726883091.83835: calling self._execute() 33192 1726883091.84139: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.84143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.84147: variable 'omit' from source: magic vars 33192 1726883091.84346: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.84366: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.84516: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.84528: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.84540: when evaluation is False, skipping this task 33192 1726883091.84549: _execute() done 33192 1726883091.84558: dumping result to json 33192 1726883091.84567: done dumping result, returning 33192 1726883091.84590: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-6c15-6a7e-0000000000ba] 33192 1726883091.84593: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000ba skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.84747: no more pending results, returning what we have 33192 1726883091.84752: results queue empty 33192 1726883091.84754: checking for any_errors_fatal 33192 1726883091.84763: done checking for any_errors_fatal 33192 1726883091.84764: 
checking for max_fail_percentage 33192 1726883091.84766: done checking for max_fail_percentage 33192 1726883091.84767: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.84768: done checking to see if all hosts have failed 33192 1726883091.84769: getting the remaining hosts for this loop 33192 1726883091.84771: done getting the remaining hosts for this loop 33192 1726883091.84776: getting the next task for host managed_node1 33192 1726883091.84783: done getting next task for host managed_node1 33192 1726883091.84787: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33192 1726883091.84792: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.84816: getting variables 33192 1726883091.84818: in VariableManager get_vars() 33192 1726883091.84872: Calling all_inventory to load vars for managed_node1 33192 1726883091.84875: Calling groups_inventory to load vars for managed_node1 33192 1726883091.84878: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.84893: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.84897: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.84901: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.85352: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000ba 33192 1726883091.85355: WORKER PROCESS EXITING 33192 1726883091.85381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.85705: done with get_vars() 33192 1726883091.85716: done getting variables 33192 1726883091.85780: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:44:51 -0400 (0:00:00.027) 0:00:05.258 ****** 33192 1726883091.85815: entering _queue_task() for managed_node1/service 33192 1726883091.86023: worker is 1 (out of 1 available) 33192 1726883091.86038: exiting _queue_task() for managed_node1/service 33192 1726883091.86049: done queuing things up, now waiting for results queue to drain 33192 1726883091.86051: waiting for pending results... 
33192 1726883091.86309: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33192 1726883091.86464: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000bb 33192 1726883091.86488: variable 'ansible_search_path' from source: unknown 33192 1726883091.86497: variable 'ansible_search_path' from source: unknown 33192 1726883091.86544: calling self._execute() 33192 1726883091.86636: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.86655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.86672: variable 'omit' from source: magic vars 33192 1726883091.87063: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.87086: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.87232: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.87247: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.87256: when evaluation is False, skipping this task 33192 1726883091.87264: _execute() done 33192 1726883091.87294: dumping result to json 33192 1726883091.87298: done dumping result, returning 33192 1726883091.87301: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-0000000000bb] 33192 1726883091.87305: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000bb skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.87463: no more pending results, returning what we have 33192 1726883091.87467: results queue empty 33192 1726883091.87469: checking for any_errors_fatal 33192 1726883091.87474: done checking for any_errors_fatal 33192 1726883091.87475: 
checking for max_fail_percentage 33192 1726883091.87477: done checking for max_fail_percentage 33192 1726883091.87478: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.87480: done checking to see if all hosts have failed 33192 1726883091.87481: getting the remaining hosts for this loop 33192 1726883091.87483: done getting the remaining hosts for this loop 33192 1726883091.87487: getting the next task for host managed_node1 33192 1726883091.87495: done getting next task for host managed_node1 33192 1726883091.87498: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33192 1726883091.87503: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.87525: getting variables 33192 1726883091.87527: in VariableManager get_vars() 33192 1726883091.87580: Calling all_inventory to load vars for managed_node1 33192 1726883091.87583: Calling groups_inventory to load vars for managed_node1 33192 1726883091.87586: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.87598: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.87602: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.87607: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.88010: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000bb 33192 1726883091.88013: WORKER PROCESS EXITING 33192 1726883091.88041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.88395: done with get_vars() 33192 1726883091.88405: done getting variables 33192 1726883091.88497: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:44:51 -0400 (0:00:00.027) 0:00:05.285 ****** 33192 1726883091.88528: entering _queue_task() for managed_node1/service 33192 1726883091.88733: worker is 1 (out of 1 available) 33192 1726883091.88746: exiting _queue_task() for managed_node1/service 33192 1726883091.88756: done queuing things up, now waiting for results queue to drain 33192 1726883091.88757: waiting for pending results... 
33192 1726883091.89153: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33192 1726883091.89159: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000bc 33192 1726883091.89167: variable 'ansible_search_path' from source: unknown 33192 1726883091.89175: variable 'ansible_search_path' from source: unknown 33192 1726883091.89216: calling self._execute() 33192 1726883091.89308: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.89323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.89344: variable 'omit' from source: magic vars 33192 1726883091.89812: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.89830: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.89987: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.90066: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.90074: when evaluation is False, skipping this task 33192 1726883091.90125: _execute() done 33192 1726883091.90135: dumping result to json 33192 1726883091.90145: done dumping result, returning 33192 1726883091.90168: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-6c15-6a7e-0000000000bc] 33192 1726883091.90180: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000bc skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33192 1726883091.90327: no more pending results, returning what we have 33192 1726883091.90331: results queue empty 33192 1726883091.90332: checking for any_errors_fatal 33192 1726883091.90341: done checking for any_errors_fatal 33192 1726883091.90342: checking for max_fail_percentage 33192 1726883091.90344: done 
checking for max_fail_percentage 33192 1726883091.90345: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.90346: done checking to see if all hosts have failed 33192 1726883091.90347: getting the remaining hosts for this loop 33192 1726883091.90349: done getting the remaining hosts for this loop 33192 1726883091.90354: getting the next task for host managed_node1 33192 1726883091.90362: done getting next task for host managed_node1 33192 1726883091.90366: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33192 1726883091.90370: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.90393: getting variables 33192 1726883091.90396: in VariableManager get_vars() 33192 1726883091.90551: Calling all_inventory to load vars for managed_node1 33192 1726883091.90554: Calling groups_inventory to load vars for managed_node1 33192 1726883091.90557: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.90569: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.90572: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.90576: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.90947: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000bc 33192 1726883091.90951: WORKER PROCESS EXITING 33192 1726883091.90976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.91300: done with get_vars() 33192 1726883091.91312: done getting variables 33192 1726883091.91374: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:44:51 -0400 (0:00:00.028) 0:00:05.314 ****** 33192 1726883091.91407: entering _queue_task() for managed_node1/service 33192 1726883091.91627: worker is 1 (out of 1 available) 33192 1726883091.91842: exiting _queue_task() for managed_node1/service 33192 1726883091.91853: done queuing things up, now waiting for results queue to drain 33192 1726883091.91854: waiting for pending results... 
33192 1726883091.91914: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33192 1726883091.92068: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000bd 33192 1726883091.92096: variable 'ansible_search_path' from source: unknown 33192 1726883091.92106: variable 'ansible_search_path' from source: unknown 33192 1726883091.92151: calling self._execute() 33192 1726883091.92243: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.92261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.92279: variable 'omit' from source: magic vars 33192 1726883091.92681: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.92700: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.92854: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.92866: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.92874: when evaluation is False, skipping this task 33192 1726883091.92882: _execute() done 33192 1726883091.92889: dumping result to json 33192 1726883091.92897: done dumping result, returning 33192 1726883091.92908: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-6c15-6a7e-0000000000bd] 33192 1726883091.92919: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000bd skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.93106: no more pending results, returning what we have 33192 1726883091.93110: results queue empty 33192 1726883091.93112: checking for any_errors_fatal 33192 1726883091.93118: done checking for any_errors_fatal 33192 1726883091.93120: checking for max_fail_percentage 33192 1726883091.93122: 
done checking for max_fail_percentage 33192 1726883091.93123: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.93124: done checking to see if all hosts have failed 33192 1726883091.93125: getting the remaining hosts for this loop 33192 1726883091.93128: done getting the remaining hosts for this loop 33192 1726883091.93133: getting the next task for host managed_node1 33192 1726883091.93143: done getting next task for host managed_node1 33192 1726883091.93147: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 33192 1726883091.93151: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.93172: getting variables 33192 1726883091.93174: in VariableManager get_vars() 33192 1726883091.93225: Calling all_inventory to load vars for managed_node1 33192 1726883091.93228: Calling groups_inventory to load vars for managed_node1 33192 1726883091.93231: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.93397: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000bd 33192 1726883091.93401: WORKER PROCESS EXITING 33192 1726883091.93410: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.93414: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.93418: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.93661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.94027: done with get_vars() 33192 1726883091.94040: done getting variables 33192 1726883091.94101: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:44:51 -0400 (0:00:00.027) 0:00:05.341 ****** 33192 1726883091.94339: entering _queue_task() for managed_node1/service 33192 1726883091.94666: worker is 1 (out of 1 available) 33192 1726883091.94679: exiting _queue_task() for managed_node1/service 33192 1726883091.94689: done queuing things up, now waiting for results queue to drain 33192 1726883091.94691: waiting for pending results... 
33192 1726883091.95131: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 33192 1726883091.95315: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000be 33192 1726883091.95337: variable 'ansible_search_path' from source: unknown 33192 1726883091.95477: variable 'ansible_search_path' from source: unknown 33192 1726883091.95482: calling self._execute() 33192 1726883091.95547: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.95561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.95587: variable 'omit' from source: magic vars 33192 1726883091.96054: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.96127: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.96249: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.96261: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.96269: when evaluation is False, skipping this task 33192 1726883091.96281: _execute() done 33192 1726883091.96289: dumping result to json 33192 1726883091.96297: done dumping result, returning 33192 1726883091.96308: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-6c15-6a7e-0000000000be] 33192 1726883091.96319: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000be skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33192 1726883091.96482: no more pending results, returning what we have 33192 1726883091.96486: results queue empty 33192 1726883091.96488: checking for any_errors_fatal 33192 1726883091.96494: done checking for any_errors_fatal 33192 1726883091.96495: checking for max_fail_percentage 33192 1726883091.96497: done checking for 
max_fail_percentage 33192 1726883091.96498: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.96499: done checking to see if all hosts have failed 33192 1726883091.96500: getting the remaining hosts for this loop 33192 1726883091.96502: done getting the remaining hosts for this loop 33192 1726883091.96506: getting the next task for host managed_node1 33192 1726883091.96513: done getting next task for host managed_node1 33192 1726883091.96517: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33192 1726883091.96521: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.96546: getting variables 33192 1726883091.96548: in VariableManager get_vars() 33192 1726883091.96593: Calling all_inventory to load vars for managed_node1 33192 1726883091.96596: Calling groups_inventory to load vars for managed_node1 33192 1726883091.96599: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.96608: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.96611: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.96614: Calling groups_plugins_play to load vars for managed_node1 33192 1726883091.96885: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000be 33192 1726883091.96888: WORKER PROCESS EXITING 33192 1726883091.96914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883091.97287: done with get_vars() 33192 1726883091.97310: done getting variables 33192 1726883091.97377: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:44:51 -0400 (0:00:00.032) 0:00:05.374 ****** 33192 1726883091.97424: entering _queue_task() for managed_node1/copy 33192 1726883091.97699: worker is 1 (out of 1 available) 33192 1726883091.97712: exiting _queue_task() for managed_node1/copy 33192 1726883091.97724: done queuing things up, now waiting for results queue to drain 33192 1726883091.97725: waiting for pending results... 
33192 1726883091.98013: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33192 1726883091.98190: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000bf 33192 1726883091.98210: variable 'ansible_search_path' from source: unknown 33192 1726883091.98218: variable 'ansible_search_path' from source: unknown 33192 1726883091.98262: calling self._execute() 33192 1726883091.98365: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883091.98389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883091.98411: variable 'omit' from source: magic vars 33192 1726883091.98863: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.98937: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883091.99058: variable 'ansible_distribution_major_version' from source: facts 33192 1726883091.99070: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883091.99080: when evaluation is False, skipping this task 33192 1726883091.99088: _execute() done 33192 1726883091.99096: dumping result to json 33192 1726883091.99104: done dumping result, returning 33192 1726883091.99115: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-6c15-6a7e-0000000000bf] 33192 1726883091.99125: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000bf skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883091.99320: no more pending results, returning what we have 33192 1726883091.99325: results queue empty 33192 1726883091.99326: checking for any_errors_fatal 33192 1726883091.99333: done checking for any_errors_fatal 33192 1726883091.99336: checking for 
max_fail_percentage 33192 1726883091.99338: done checking for max_fail_percentage 33192 1726883091.99339: checking to see if all hosts have failed and the running result is not ok 33192 1726883091.99341: done checking to see if all hosts have failed 33192 1726883091.99342: getting the remaining hosts for this loop 33192 1726883091.99344: done getting the remaining hosts for this loop 33192 1726883091.99348: getting the next task for host managed_node1 33192 1726883091.99356: done getting next task for host managed_node1 33192 1726883091.99360: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33192 1726883091.99376: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883091.99397: getting variables 33192 1726883091.99400: in VariableManager get_vars() 33192 1726883091.99608: Calling all_inventory to load vars for managed_node1 33192 1726883091.99611: Calling groups_inventory to load vars for managed_node1 33192 1726883091.99614: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883091.99624: Calling all_plugins_play to load vars for managed_node1 33192 1726883091.99627: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883091.99631: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.00211: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000bf 33192 1726883092.00214: WORKER PROCESS EXITING 33192 1726883092.00259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.00625: done with get_vars() 33192 1726883092.00637: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:44:52 -0400 (0:00:00.033) 0:00:05.407 ****** 33192 1726883092.00743: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 33192 1726883092.01000: worker is 1 (out of 1 available) 33192 1726883092.01144: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 33192 1726883092.01155: done queuing things up, now waiting for results queue to drain 33192 1726883092.01156: waiting for pending results... 
33192 1726883092.01327: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33192 1726883092.01507: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000c0 33192 1726883092.01536: variable 'ansible_search_path' from source: unknown 33192 1726883092.01546: variable 'ansible_search_path' from source: unknown 33192 1726883092.01605: calling self._execute() 33192 1726883092.01714: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.01728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.01749: variable 'omit' from source: magic vars 33192 1726883092.02228: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.02251: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.02415: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.02431: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.02449: when evaluation is False, skipping this task 33192 1726883092.02523: _execute() done 33192 1726883092.02527: dumping result to json 33192 1726883092.02530: done dumping result, returning 33192 1726883092.02533: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-6c15-6a7e-0000000000c0] 33192 1726883092.02541: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000c0 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.02909: no more pending results, returning what we have 33192 1726883092.02912: results queue empty 33192 1726883092.02914: checking for any_errors_fatal 33192 1726883092.02920: done checking for any_errors_fatal 33192 1726883092.02921: checking for max_fail_percentage 33192 
1726883092.02923: done checking for max_fail_percentage 33192 1726883092.02924: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.02925: done checking to see if all hosts have failed 33192 1726883092.02926: getting the remaining hosts for this loop 33192 1726883092.02928: done getting the remaining hosts for this loop 33192 1726883092.02932: getting the next task for host managed_node1 33192 1726883092.02939: done getting next task for host managed_node1 33192 1726883092.02943: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 33192 1726883092.02948: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883092.02965: getting variables 33192 1726883092.02967: in VariableManager get_vars() 33192 1726883092.03014: Calling all_inventory to load vars for managed_node1 33192 1726883092.03017: Calling groups_inventory to load vars for managed_node1 33192 1726883092.03020: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.03029: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.03033: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.03249: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.03815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.04376: done with get_vars() 33192 1726883092.04387: done getting variables 33192 1726883092.04410: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000c0 33192 1726883092.04414: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:52 -0400 (0:00:00.037) 0:00:05.445 ****** 33192 1726883092.04501: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 33192 1726883092.04828: worker is 1 (out of 1 available) 33192 1726883092.04846: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 33192 1726883092.05145: done queuing things up, now waiting for results queue to drain 33192 1726883092.05147: waiting for pending results... 
33192 1726883092.05466: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 33192 1726883092.05918: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000c1 33192 1726883092.05921: variable 'ansible_search_path' from source: unknown 33192 1726883092.05924: variable 'ansible_search_path' from source: unknown 33192 1726883092.06136: calling self._execute() 33192 1726883092.06139: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.06143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.06289: variable 'omit' from source: magic vars 33192 1726883092.07827: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.07852: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.08163: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.08443: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.08446: when evaluation is False, skipping this task 33192 1726883092.08449: _execute() done 33192 1726883092.08452: dumping result to json 33192 1726883092.08454: done dumping result, returning 33192 1726883092.08457: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-6c15-6a7e-0000000000c1] 33192 1726883092.08460: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000c1 33192 1726883092.08537: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000c1 33192 1726883092.08542: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.08605: no more pending results, returning what we have 33192 1726883092.08610: results queue empty 33192 1726883092.08612: checking for any_errors_fatal 33192 
1726883092.08622: done checking for any_errors_fatal 33192 1726883092.08624: checking for max_fail_percentage 33192 1726883092.08625: done checking for max_fail_percentage 33192 1726883092.08626: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.08628: done checking to see if all hosts have failed 33192 1726883092.08629: getting the remaining hosts for this loop 33192 1726883092.08631: done getting the remaining hosts for this loop 33192 1726883092.08638: getting the next task for host managed_node1 33192 1726883092.08645: done getting next task for host managed_node1 33192 1726883092.08649: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33192 1726883092.08653: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883092.08676: getting variables 33192 1726883092.08678: in VariableManager get_vars() 33192 1726883092.08731: Calling all_inventory to load vars for managed_node1 33192 1726883092.08740: Calling groups_inventory to load vars for managed_node1 33192 1726883092.08743: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.08757: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.08760: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.08764: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.09385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.10161: done with get_vars() 33192 1726883092.10173: done getting variables 33192 1726883092.10254: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:52 -0400 (0:00:00.057) 0:00:05.503 ****** 33192 1726883092.10292: entering _queue_task() for managed_node1/debug 33192 1726883092.10917: worker is 1 (out of 1 available) 33192 1726883092.10931: exiting _queue_task() for managed_node1/debug 33192 1726883092.11112: done queuing things up, now waiting for results queue to drain 33192 1726883092.11113: waiting for pending results... 
33192 1726883092.11429: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33192 1726883092.11703: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000c2 33192 1726883092.11941: variable 'ansible_search_path' from source: unknown 33192 1726883092.11945: variable 'ansible_search_path' from source: unknown 33192 1726883092.11948: calling self._execute() 33192 1726883092.12200: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.12204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.12207: variable 'omit' from source: magic vars 33192 1726883092.13065: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.13184: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.13399: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.13448: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.13457: when evaluation is False, skipping this task 33192 1726883092.13465: _execute() done 33192 1726883092.13472: dumping result to json 33192 1726883092.13538: done dumping result, returning 33192 1726883092.13542: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-6c15-6a7e-0000000000c2] 33192 1726883092.13545: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000c2 33192 1726883092.13903: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000c2 33192 1726883092.13906: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883092.13961: no more pending results, returning what we have 33192 1726883092.13966: results queue empty 33192 1726883092.13967: checking for any_errors_fatal 33192 1726883092.13973: done 
checking for any_errors_fatal 33192 1726883092.13974: checking for max_fail_percentage 33192 1726883092.13976: done checking for max_fail_percentage 33192 1726883092.13977: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.13978: done checking to see if all hosts have failed 33192 1726883092.13979: getting the remaining hosts for this loop 33192 1726883092.13981: done getting the remaining hosts for this loop 33192 1726883092.13986: getting the next task for host managed_node1 33192 1726883092.13993: done getting next task for host managed_node1 33192 1726883092.13998: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33192 1726883092.14002: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883092.14024: getting variables 33192 1726883092.14026: in VariableManager get_vars() 33192 1726883092.14081: Calling all_inventory to load vars for managed_node1 33192 1726883092.14085: Calling groups_inventory to load vars for managed_node1 33192 1726883092.14088: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.14101: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.14105: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.14109: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.14867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.15543: done with get_vars() 33192 1726883092.15625: done getting variables 33192 1726883092.15812: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:44:52 -0400 (0:00:00.055) 0:00:05.558 ****** 33192 1726883092.15851: entering _queue_task() for managed_node1/debug 33192 1726883092.16701: worker is 1 (out of 1 available) 33192 1726883092.16714: exiting _queue_task() for managed_node1/debug 33192 1726883092.16726: done queuing things up, now waiting for results queue to drain 33192 1726883092.16727: waiting for pending results... 
33192 1726883092.17836: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33192 1726883092.18253: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000c3 33192 1726883092.18259: variable 'ansible_search_path' from source: unknown 33192 1726883092.18262: variable 'ansible_search_path' from source: unknown 33192 1726883092.18265: calling self._execute() 33192 1726883092.18759: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.18771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.19022: variable 'omit' from source: magic vars 33192 1726883092.20408: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.20645: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.21077: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.21175: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.21248: when evaluation is False, skipping this task 33192 1726883092.21259: _execute() done 33192 1726883092.21650: dumping result to json 33192 1726883092.21656: done dumping result, returning 33192 1726883092.21660: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-6c15-6a7e-0000000000c3] 33192 1726883092.21663: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000c3 33192 1726883092.21745: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000c3 33192 1726883092.21839: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883092.21899: no more pending results, returning what we have 33192 1726883092.21904: results queue empty 33192 1726883092.21906: checking for any_errors_fatal 33192 1726883092.21912: done 
checking for any_errors_fatal 33192 1726883092.21913: checking for max_fail_percentage 33192 1726883092.21915: done checking for max_fail_percentage 33192 1726883092.21917: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.21918: done checking to see if all hosts have failed 33192 1726883092.21919: getting the remaining hosts for this loop 33192 1726883092.21921: done getting the remaining hosts for this loop 33192 1726883092.21927: getting the next task for host managed_node1 33192 1726883092.21937: done getting next task for host managed_node1 33192 1726883092.21942: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33192 1726883092.21946: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883092.21970: getting variables 33192 1726883092.21973: in VariableManager get_vars() 33192 1726883092.22032: Calling all_inventory to load vars for managed_node1 33192 1726883092.22240: Calling groups_inventory to load vars for managed_node1 33192 1726883092.22245: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.22268: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.22273: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.22278: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.23382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.23969: done with get_vars() 33192 1726883092.23983: done getting variables 33192 1726883092.24172: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:44:52 -0400 (0:00:00.083) 0:00:05.642 ****** 33192 1726883092.24211: entering _queue_task() for managed_node1/debug 33192 1726883092.24824: worker is 1 (out of 1 available) 33192 1726883092.25145: exiting _queue_task() for managed_node1/debug 33192 1726883092.25157: done queuing things up, now waiting for results queue to drain 33192 1726883092.25158: waiting for pending results... 
33192 1726883092.25759: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33192 1726883092.26241: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000c4 33192 1726883092.26245: variable 'ansible_search_path' from source: unknown 33192 1726883092.26248: variable 'ansible_search_path' from source: unknown 33192 1726883092.26251: calling self._execute() 33192 1726883092.26398: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.26644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.26647: variable 'omit' from source: magic vars 33192 1726883092.27492: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.27511: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.27686: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.27707: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.27715: when evaluation is False, skipping this task 33192 1726883092.27726: _execute() done 33192 1726883092.27737: dumping result to json 33192 1726883092.27747: done dumping result, returning 33192 1726883092.27762: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-6c15-6a7e-0000000000c4] 33192 1726883092.27778: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000c4 skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883092.27957: no more pending results, returning what we have 33192 1726883092.27961: results queue empty 33192 1726883092.27963: checking for any_errors_fatal 33192 1726883092.27971: done checking for any_errors_fatal 33192 1726883092.27972: checking for max_fail_percentage 33192 1726883092.27973: done checking for max_fail_percentage 33192 
1726883092.27974: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.27975: done checking to see if all hosts have failed 33192 1726883092.27976: getting the remaining hosts for this loop 33192 1726883092.27978: done getting the remaining hosts for this loop 33192 1726883092.27983: getting the next task for host managed_node1 33192 1726883092.27991: done getting next task for host managed_node1 33192 1726883092.27995: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 33192 1726883092.27999: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883092.28022: getting variables 33192 1726883092.28024: in VariableManager get_vars() 33192 1726883092.28075: Calling all_inventory to load vars for managed_node1 33192 1726883092.28078: Calling groups_inventory to load vars for managed_node1 33192 1726883092.28082: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.28092: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.28095: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.28099: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.28389: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000c4 33192 1726883092.28393: WORKER PROCESS EXITING 33192 1726883092.28410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.28785: done with get_vars() 33192 1726883092.28802: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:44:52 -0400 (0:00:00.047) 0:00:05.689 ****** 33192 1726883092.28915: entering _queue_task() for managed_node1/ping 33192 1726883092.29265: worker is 1 (out of 1 available) 33192 1726883092.29277: exiting _queue_task() for managed_node1/ping 33192 1726883092.29290: done queuing things up, now waiting for results queue to drain 33192 1726883092.29292: waiting for pending results... 
33192 1726883092.30023: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 33192 1726883092.30541: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000c5 33192 1726883092.30546: variable 'ansible_search_path' from source: unknown 33192 1726883092.30549: variable 'ansible_search_path' from source: unknown 33192 1726883092.30553: calling self._execute() 33192 1726883092.30816: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.31239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.31242: variable 'omit' from source: magic vars 33192 1726883092.31831: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.31859: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.32015: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.32027: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.32040: when evaluation is False, skipping this task 33192 1726883092.32049: _execute() done 33192 1726883092.32065: dumping result to json 33192 1726883092.32075: done dumping result, returning 33192 1726883092.32087: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-6c15-6a7e-0000000000c5] 33192 1726883092.32097: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000c5 33192 1726883092.32344: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000c5 33192 1726883092.32348: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.32398: no more pending results, returning what we have 33192 1726883092.32402: results queue empty 33192 1726883092.32404: checking for any_errors_fatal 33192 
1726883092.32413: done checking for any_errors_fatal 33192 1726883092.32415: checking for max_fail_percentage 33192 1726883092.32417: done checking for max_fail_percentage 33192 1726883092.32418: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.32419: done checking to see if all hosts have failed 33192 1726883092.32420: getting the remaining hosts for this loop 33192 1726883092.32421: done getting the remaining hosts for this loop 33192 1726883092.32426: getting the next task for host managed_node1 33192 1726883092.32436: done getting next task for host managed_node1 33192 1726883092.32439: ^ task is: TASK: meta (role_complete) 33192 1726883092.32442: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33192 1726883092.32463: getting variables 33192 1726883092.32465: in VariableManager get_vars() 33192 1726883092.32517: Calling all_inventory to load vars for managed_node1 33192 1726883092.32520: Calling groups_inventory to load vars for managed_node1 33192 1726883092.32523: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.32540: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.32545: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.32550: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.32925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.33339: done with get_vars() 33192 1726883092.33352: done getting variables 33192 1726883092.33449: done queuing things up, now waiting for results queue to drain 33192 1726883092.33451: results queue empty 33192 1726883092.33452: checking for any_errors_fatal 33192 1726883092.33454: done checking for any_errors_fatal 33192 1726883092.33455: checking for max_fail_percentage 33192 1726883092.33456: done checking for max_fail_percentage 33192 1726883092.33457: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.33458: done checking to see if all hosts have failed 33192 1726883092.33459: getting the remaining hosts for this loop 33192 1726883092.33460: done getting the remaining hosts for this loop 33192 1726883092.33462: getting the next task for host managed_node1 33192 1726883092.33469: done getting next task for host managed_node1 33192 1726883092.33472: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33192 1726883092.33475: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 33192 1726883092.33487: getting variables 33192 1726883092.33488: in VariableManager get_vars() 33192 1726883092.33513: Calling all_inventory to load vars for managed_node1 33192 1726883092.33521: Calling groups_inventory to load vars for managed_node1 33192 1726883092.33524: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.33530: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.33533: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.33539: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.33791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.34126: done with get_vars() 33192 1726883092.34139: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:52 -0400 (0:00:00.053) 0:00:05.742 ****** 33192 1726883092.34241: entering _queue_task() for managed_node1/include_tasks 33192 1726883092.34553: worker is 1 (out of 1 available) 33192 1726883092.34565: exiting _queue_task() for managed_node1/include_tasks 33192 1726883092.34578: done queuing things up, now waiting for 
results queue to drain 33192 1726883092.34579: waiting for pending results... 33192 1726883092.35085: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33192 1726883092.35257: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000fd 33192 1726883092.35287: variable 'ansible_search_path' from source: unknown 33192 1726883092.35296: variable 'ansible_search_path' from source: unknown 33192 1726883092.35341: calling self._execute() 33192 1726883092.35446: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.35490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.35494: variable 'omit' from source: magic vars 33192 1726883092.35939: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.36035: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.36113: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.36126: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.36139: when evaluation is False, skipping this task 33192 1726883092.36151: _execute() done 33192 1726883092.36239: dumping result to json 33192 1726883092.36243: done dumping result, returning 33192 1726883092.36251: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-6c15-6a7e-0000000000fd] 33192 1726883092.36254: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000fd 33192 1726883092.36329: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000fd 33192 1726883092.36332: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.36388: no more pending results, returning what we have 33192 
1726883092.36393: results queue empty 33192 1726883092.36395: checking for any_errors_fatal 33192 1726883092.36396: done checking for any_errors_fatal 33192 1726883092.36397: checking for max_fail_percentage 33192 1726883092.36399: done checking for max_fail_percentage 33192 1726883092.36400: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.36401: done checking to see if all hosts have failed 33192 1726883092.36402: getting the remaining hosts for this loop 33192 1726883092.36404: done getting the remaining hosts for this loop 33192 1726883092.36408: getting the next task for host managed_node1 33192 1726883092.36417: done getting next task for host managed_node1 33192 1726883092.36421: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 33192 1726883092.36427: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.36648: getting variables 33192 1726883092.36650: in VariableManager get_vars() 33192 1726883092.36696: Calling all_inventory to load vars for managed_node1 33192 1726883092.36700: Calling groups_inventory to load vars for managed_node1 33192 1726883092.36703: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.36712: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.36715: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.36719: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.36976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.37330: done with get_vars() 33192 1726883092.37343: done getting variables 33192 1726883092.37412: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:52 -0400 (0:00:00.032) 0:00:05.774 ****** 33192 1726883092.37450: entering _queue_task() for managed_node1/debug 33192 1726883092.37686: worker is 1 (out of 1 available) 33192 1726883092.37699: exiting _queue_task() for managed_node1/debug 33192 1726883092.37712: done queuing things up, now waiting for results queue to drain 33192 1726883092.37713: waiting for pending results... 
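The repeated "Evaluated conditional … True / … False" followed by "skipping:" entries above come from tasks guarded by a `when:` clause; every listed condition must hold for the task to run, and the first failing one is reported back as `false_condition`. A minimal sketch of such a guarded task, assuming a hypothetical `network_provider` variable (not taken from this log):

```yaml
# Sketch of a when-guarded task of the kind being skipped above.
# The two conditions mirror the log: the first evaluates True, the
# second False, so the task is skipped with "Conditional result was False".
- name: Print network provider
  ansible.builtin.debug:
    msg: "network provider is {{ network_provider | default('undefined') }}"
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version == '7'
```

A `when:` list is AND-ed, so on any release other than EL7 the second condition fails and the task is skipped exactly as the log shows.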
33192 1726883092.38159: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 33192 1726883092.38164: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000fe 33192 1726883092.38169: variable 'ansible_search_path' from source: unknown 33192 1726883092.38176: variable 'ansible_search_path' from source: unknown 33192 1726883092.38221: calling self._execute() 33192 1726883092.38321: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.38336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.38361: variable 'omit' from source: magic vars 33192 1726883092.38828: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.38831: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.38978: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.38992: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.39000: when evaluation is False, skipping this task 33192 1726883092.39013: _execute() done 33192 1726883092.39040: dumping result to json 33192 1726883092.39043: done dumping result, returning 33192 1726883092.39046: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-6c15-6a7e-0000000000fe] 33192 1726883092.39053: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000fe 33192 1726883092.39275: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000fe 33192 1726883092.39278: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883092.39414: no more pending results, returning what we have 33192 1726883092.39419: results queue empty 33192 1726883092.39420: checking for any_errors_fatal 33192 1726883092.39432: done checking for any_errors_fatal 33192 1726883092.39435: 
checking for max_fail_percentage 33192 1726883092.39437: done checking for max_fail_percentage 33192 1726883092.39438: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.39439: done checking to see if all hosts have failed 33192 1726883092.39440: getting the remaining hosts for this loop 33192 1726883092.39442: done getting the remaining hosts for this loop 33192 1726883092.39450: getting the next task for host managed_node1 33192 1726883092.39456: done getting next task for host managed_node1 33192 1726883092.39460: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33192 1726883092.39465: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.39484: getting variables 33192 1726883092.39486: in VariableManager get_vars() 33192 1726883092.39530: Calling all_inventory to load vars for managed_node1 33192 1726883092.39535: Calling groups_inventory to load vars for managed_node1 33192 1726883092.39545: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.39564: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.39568: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.39572: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.40021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.40637: done with get_vars() 33192 1726883092.40745: done getting variables 33192 1726883092.40897: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:52 -0400 (0:00:00.034) 0:00:05.809 ****** 33192 1726883092.40932: entering _queue_task() for managed_node1/fail 33192 1726883092.41438: worker is 1 (out of 1 available) 33192 1726883092.41451: exiting _queue_task() for managed_node1/fail 33192 1726883092.41463: done queuing things up, now waiting for results queue to drain 33192 1726883092.41464: waiting for pending results... 
33192 1726883092.41762: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33192 1726883092.41921: in run() - task 0affe814-3a2d-6c15-6a7e-0000000000ff 33192 1726883092.41947: variable 'ansible_search_path' from source: unknown 33192 1726883092.41964: variable 'ansible_search_path' from source: unknown 33192 1726883092.42006: calling self._execute() 33192 1726883092.42104: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.42118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.42171: variable 'omit' from source: magic vars 33192 1726883092.42556: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.42575: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.42736: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.42749: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.42757: when evaluation is False, skipping this task 33192 1726883092.42833: _execute() done 33192 1726883092.42838: dumping result to json 33192 1726883092.42840: done dumping result, returning 33192 1726883092.42843: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-6c15-6a7e-0000000000ff] 33192 1726883092.42846: sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000ff 33192 1726883092.42915: done sending task result for task 0affe814-3a2d-6c15-6a7e-0000000000ff 33192 1726883092.42918: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 
33192 1726883092.42973: no more pending results, returning what we have 33192 1726883092.42977: results queue empty 33192 1726883092.42978: checking for any_errors_fatal 33192 1726883092.42986: done checking for any_errors_fatal 33192 1726883092.42987: checking for max_fail_percentage 33192 1726883092.42989: done checking for max_fail_percentage 33192 1726883092.42990: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.42991: done checking to see if all hosts have failed 33192 1726883092.42992: getting the remaining hosts for this loop 33192 1726883092.42994: done getting the remaining hosts for this loop 33192 1726883092.42999: getting the next task for host managed_node1 33192 1726883092.43007: done getting next task for host managed_node1 33192 1726883092.43010: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33192 1726883092.43016: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.43039: getting variables 33192 1726883092.43042: in VariableManager get_vars() 33192 1726883092.43189: Calling all_inventory to load vars for managed_node1 33192 1726883092.43192: Calling groups_inventory to load vars for managed_node1 33192 1726883092.43195: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.43206: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.43210: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.43214: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.43721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.44150: done with get_vars() 33192 1726883092.44163: done getting variables 33192 1726883092.44226: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:52 -0400 (0:00:00.033) 0:00:05.843 ****** 33192 1726883092.44281: entering _queue_task() for managed_node1/fail 33192 1726883092.44530: worker is 1 (out of 1 available) 33192 1726883092.44547: exiting _queue_task() for managed_node1/fail 33192 1726883092.44678: done queuing things up, now waiting for results queue to drain 33192 1726883092.44680: waiting for pending results... 
33192 1726883092.45012: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33192 1726883092.45077: in run() - task 0affe814-3a2d-6c15-6a7e-000000000100 33192 1726883092.45177: variable 'ansible_search_path' from source: unknown 33192 1726883092.45181: variable 'ansible_search_path' from source: unknown 33192 1726883092.45224: calling self._execute() 33192 1726883092.45371: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.45387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.45404: variable 'omit' from source: magic vars 33192 1726883092.45730: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.45742: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.45843: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.45848: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.45851: when evaluation is False, skipping this task 33192 1726883092.45857: _execute() done 33192 1726883092.45860: dumping result to json 33192 1726883092.45868: done dumping result, returning 33192 1726883092.45874: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-6c15-6a7e-000000000100] 33192 1726883092.45878: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000100 33192 1726883092.45975: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000100 33192 1726883092.45979: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.46025: no more 
pending results, returning what we have 33192 1726883092.46028: results queue empty 33192 1726883092.46029: checking for any_errors_fatal 33192 1726883092.46037: done checking for any_errors_fatal 33192 1726883092.46038: checking for max_fail_percentage 33192 1726883092.46039: done checking for max_fail_percentage 33192 1726883092.46040: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.46041: done checking to see if all hosts have failed 33192 1726883092.46042: getting the remaining hosts for this loop 33192 1726883092.46044: done getting the remaining hosts for this loop 33192 1726883092.46047: getting the next task for host managed_node1 33192 1726883092.46053: done getting next task for host managed_node1 33192 1726883092.46056: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33192 1726883092.46061: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.46080: getting variables 33192 1726883092.46082: in VariableManager get_vars() 33192 1726883092.46116: Calling all_inventory to load vars for managed_node1 33192 1726883092.46118: Calling groups_inventory to load vars for managed_node1 33192 1726883092.46120: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.46126: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.46128: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.46130: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.46317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.46504: done with get_vars() 33192 1726883092.46512: done getting variables 33192 1726883092.46560: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:44:52 -0400 (0:00:00.023) 0:00:05.866 ****** 33192 1726883092.46587: entering _queue_task() for managed_node1/fail 33192 1726883092.46766: worker is 1 (out of 1 available) 33192 1726883092.46782: exiting _queue_task() for managed_node1/fail 33192 1726883092.46795: done queuing things up, now waiting for results queue to drain 33192 1726883092.46796: waiting for pending results... 
33192 1726883092.46955: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33192 1726883092.47054: in run() - task 0affe814-3a2d-6c15-6a7e-000000000101 33192 1726883092.47066: variable 'ansible_search_path' from source: unknown 33192 1726883092.47069: variable 'ansible_search_path' from source: unknown 33192 1726883092.47100: calling self._execute() 33192 1726883092.47167: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.47176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.47185: variable 'omit' from source: magic vars 33192 1726883092.47550: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.47556: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.47675: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.47679: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.47683: when evaluation is False, skipping this task 33192 1726883092.47687: _execute() done 33192 1726883092.47690: dumping result to json 33192 1726883092.47703: done dumping result, returning 33192 1726883092.47707: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-6c15-6a7e-000000000101] 33192 1726883092.47709: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000101 33192 1726883092.47807: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000101 33192 1726883092.47812: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.47949: no more pending 
results, returning what we have 33192 1726883092.47953: results queue empty 33192 1726883092.47954: checking for any_errors_fatal 33192 1726883092.47959: done checking for any_errors_fatal 33192 1726883092.47960: checking for max_fail_percentage 33192 1726883092.47961: done checking for max_fail_percentage 33192 1726883092.47962: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.47963: done checking to see if all hosts have failed 33192 1726883092.47964: getting the remaining hosts for this loop 33192 1726883092.47966: done getting the remaining hosts for this loop 33192 1726883092.47969: getting the next task for host managed_node1 33192 1726883092.47975: done getting next task for host managed_node1 33192 1726883092.47979: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33192 1726883092.47983: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.47999: getting variables 33192 1726883092.48001: in VariableManager get_vars() 33192 1726883092.48048: Calling all_inventory to load vars for managed_node1 33192 1726883092.48051: Calling groups_inventory to load vars for managed_node1 33192 1726883092.48054: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.48063: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.48066: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.48070: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.48326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.48695: done with get_vars() 33192 1726883092.48707: done getting variables 33192 1726883092.48769: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:44:52 -0400 (0:00:00.022) 0:00:05.888 ****** 33192 1726883092.48815: entering _queue_task() for managed_node1/dnf 33192 1726883092.49257: worker is 1 (out of 1 available) 33192 1726883092.49267: exiting _queue_task() for managed_node1/dnf 33192 1726883092.49284: done queuing things up, now waiting for results queue to drain 33192 1726883092.49285: waiting for pending results... 
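The DNF update-check task queued above (and its YUM counterpart that follows) is skipped by the same distribution-version guard; note that on this ansible-core version the YUM variant is served by the same action plugin, which is why the log later records "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf". A hedged sketch of what such a check-only task can look like (the package name is illustrative, not read from this log):

```yaml
# Illustrative check-only update query; check_mode prevents any change
# on the managed host, so the task only reports whether updates exist.
- name: Check if updates for network packages are available
  ansible.builtin.dnf:
    name:
      - NetworkManager   # illustrative package, not taken from this log
    state: latest
  check_mode: true
  when: ansible_distribution_major_version == '7'
```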
33192 1726883092.49524: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33192 1726883092.49528: in run() - task 0affe814-3a2d-6c15-6a7e-000000000102 33192 1726883092.49536: variable 'ansible_search_path' from source: unknown 33192 1726883092.49539: variable 'ansible_search_path' from source: unknown 33192 1726883092.49580: calling self._execute() 33192 1726883092.49684: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.49699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.49714: variable 'omit' from source: magic vars 33192 1726883092.50173: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.50184: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.50301: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.50305: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.50309: when evaluation is False, skipping this task 33192 1726883092.50312: _execute() done 33192 1726883092.50314: dumping result to json 33192 1726883092.50317: done dumping result, returning 33192 1726883092.50320: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-000000000102] 33192 1726883092.50323: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000102 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.50501: no more pending results, returning what we have 33192 1726883092.50506: results queue empty 33192 
1726883092.50508: checking for any_errors_fatal 33192 1726883092.50513: done checking for any_errors_fatal 33192 1726883092.50514: checking for max_fail_percentage 33192 1726883092.50515: done checking for max_fail_percentage 33192 1726883092.50516: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.50518: done checking to see if all hosts have failed 33192 1726883092.50518: getting the remaining hosts for this loop 33192 1726883092.50520: done getting the remaining hosts for this loop 33192 1726883092.50523: getting the next task for host managed_node1 33192 1726883092.50530: done getting next task for host managed_node1 33192 1726883092.50533: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33192 1726883092.50538: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.50555: getting variables 33192 1726883092.50557: in VariableManager get_vars() 33192 1726883092.50620: Calling all_inventory to load vars for managed_node1 33192 1726883092.50623: Calling groups_inventory to load vars for managed_node1 33192 1726883092.50626: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.50637: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.50641: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.50644: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.50924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.51268: done with get_vars() 33192 1726883092.51285: done getting variables 33192 1726883092.51315: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000102 33192 1726883092.51318: WORKER PROCESS EXITING redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 33192 1726883092.51379: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:44:52 -0400 (0:00:00.026) 0:00:05.915 ****** 33192 1726883092.51457: entering _queue_task() for managed_node1/yum 33192 1726883092.51885: worker is 1 (out of 1 available) 33192 1726883092.51897: exiting _queue_task() for managed_node1/yum 33192 1726883092.51908: done queuing things up, now 
waiting for results queue to drain 33192 1726883092.51909: waiting for pending results... 33192 1726883092.52082: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33192 1726883092.52321: in run() - task 0affe814-3a2d-6c15-6a7e-000000000103 33192 1726883092.52345: variable 'ansible_search_path' from source: unknown 33192 1726883092.52355: variable 'ansible_search_path' from source: unknown 33192 1726883092.52409: calling self._execute() 33192 1726883092.52607: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.52612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.52615: variable 'omit' from source: magic vars 33192 1726883092.53012: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.53032: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.53198: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.53212: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.53221: when evaluation is False, skipping this task 33192 1726883092.53228: _execute() done 33192 1726883092.53239: dumping result to json 33192 1726883092.53248: done dumping result, returning 33192 1726883092.53268: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-000000000103] 33192 1726883092.53281: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000103 33192 1726883092.53511: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000103 33192 1726883092.53515: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, 
"false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.53566: no more pending results, returning what we have 33192 1726883092.53569: results queue empty 33192 1726883092.53570: checking for any_errors_fatal 33192 1726883092.53578: done checking for any_errors_fatal 33192 1726883092.53579: checking for max_fail_percentage 33192 1726883092.53580: done checking for max_fail_percentage 33192 1726883092.53581: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.53645: done checking to see if all hosts have failed 33192 1726883092.53646: getting the remaining hosts for this loop 33192 1726883092.53647: done getting the remaining hosts for this loop 33192 1726883092.53651: getting the next task for host managed_node1 33192 1726883092.53657: done getting next task for host managed_node1 33192 1726883092.53661: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33192 1726883092.53665: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.53683: getting variables 33192 1726883092.53685: in VariableManager get_vars() 33192 1726883092.53731: Calling all_inventory to load vars for managed_node1 33192 1726883092.53735: Calling groups_inventory to load vars for managed_node1 33192 1726883092.53742: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.53752: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.53755: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.53758: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.53990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.54576: done with get_vars() 33192 1726883092.54589: done getting variables 33192 1726883092.54685: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:44:52 -0400 (0:00:00.032) 0:00:05.947 ****** 33192 1726883092.54728: entering _queue_task() for managed_node1/fail 33192 1726883092.55059: worker is 1 (out of 1 available) 33192 1726883092.55071: exiting _queue_task() for managed_node1/fail 33192 1726883092.55082: done queuing things up, now waiting for results queue to drain 33192 1726883092.55083: waiting for pending results... 
33192 1726883092.55393: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33192 1726883092.55529: in run() - task 0affe814-3a2d-6c15-6a7e-000000000104 33192 1726883092.55598: variable 'ansible_search_path' from source: unknown 33192 1726883092.55602: variable 'ansible_search_path' from source: unknown 33192 1726883092.55633: calling self._execute() 33192 1726883092.55732: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.55944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.55948: variable 'omit' from source: magic vars 33192 1726883092.56348: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.56366: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.56542: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.56557: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.56567: when evaluation is False, skipping this task 33192 1726883092.56576: _execute() done 33192 1726883092.56585: dumping result to json 33192 1726883092.56605: done dumping result, returning 33192 1726883092.56718: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-000000000104] 33192 1726883092.56722: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000104 33192 1726883092.56828: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000104 33192 1726883092.56832: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.56886: no more pending results, returning what we have 
33192 1726883092.56889: results queue empty 33192 1726883092.56889: checking for any_errors_fatal 33192 1726883092.56895: done checking for any_errors_fatal 33192 1726883092.56896: checking for max_fail_percentage 33192 1726883092.56897: done checking for max_fail_percentage 33192 1726883092.56897: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.56898: done checking to see if all hosts have failed 33192 1726883092.56899: getting the remaining hosts for this loop 33192 1726883092.56900: done getting the remaining hosts for this loop 33192 1726883092.56903: getting the next task for host managed_node1 33192 1726883092.56907: done getting next task for host managed_node1 33192 1726883092.56909: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 33192 1726883092.56912: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.56925: getting variables 33192 1726883092.56926: in VariableManager get_vars() 33192 1726883092.56975: Calling all_inventory to load vars for managed_node1 33192 1726883092.56977: Calling groups_inventory to load vars for managed_node1 33192 1726883092.56979: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.56986: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.56988: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.56991: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.57142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.57333: done with get_vars() 33192 1726883092.57343: done getting variables 33192 1726883092.57391: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:44:52 -0400 (0:00:00.026) 0:00:05.974 ****** 33192 1726883092.57420: entering _queue_task() for managed_node1/package 33192 1726883092.57601: worker is 1 (out of 1 available) 33192 1726883092.57616: exiting _queue_task() for managed_node1/package 33192 1726883092.57629: done queuing things up, now waiting for results queue to drain 33192 1726883092.57630: waiting for pending results... 
33192 1726883092.57810: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 33192 1726883092.57919: in run() - task 0affe814-3a2d-6c15-6a7e-000000000105 33192 1726883092.57932: variable 'ansible_search_path' from source: unknown 33192 1726883092.57937: variable 'ansible_search_path' from source: unknown 33192 1726883092.57977: calling self._execute() 33192 1726883092.58042: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.58048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.58058: variable 'omit' from source: magic vars 33192 1726883092.58360: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.58372: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.58506: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.58512: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.58516: when evaluation is False, skipping this task 33192 1726883092.58519: _execute() done 33192 1726883092.58523: dumping result to json 33192 1726883092.58526: done dumping result, returning 33192 1726883092.58529: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-6c15-6a7e-000000000105] 33192 1726883092.58545: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000105 33192 1726883092.58649: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000105 33192 1726883092.58653: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.58708: no more pending results, returning what we have 33192 1726883092.58711: results queue empty 33192 1726883092.58712: checking for any_errors_fatal 33192 1726883092.58718: done 
checking for any_errors_fatal 33192 1726883092.58719: checking for max_fail_percentage 33192 1726883092.58720: done checking for max_fail_percentage 33192 1726883092.58721: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.58722: done checking to see if all hosts have failed 33192 1726883092.58723: getting the remaining hosts for this loop 33192 1726883092.58725: done getting the remaining hosts for this loop 33192 1726883092.58728: getting the next task for host managed_node1 33192 1726883092.58736: done getting next task for host managed_node1 33192 1726883092.58740: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 33192 1726883092.58744: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.58761: getting variables 33192 1726883092.58763: in VariableManager get_vars() 33192 1726883092.58806: Calling all_inventory to load vars for managed_node1 33192 1726883092.58809: Calling groups_inventory to load vars for managed_node1 33192 1726883092.58811: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.58818: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.58820: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.58822: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.59049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.59312: done with get_vars() 33192 1726883092.59324: done getting variables 33192 1726883092.59387: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:44:52 -0400 (0:00:00.019) 0:00:05.994 ****** 33192 1726883092.59426: entering _queue_task() for managed_node1/package 33192 1726883092.59675: worker is 1 (out of 1 available) 33192 1726883092.59690: exiting _queue_task() for managed_node1/package 33192 1726883092.59703: done queuing things up, now waiting for results queue to drain 33192 1726883092.59704: waiting for pending results... 
33192 1726883092.60073: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 33192 1726883092.60097: in run() - task 0affe814-3a2d-6c15-6a7e-000000000106 33192 1726883092.60118: variable 'ansible_search_path' from source: unknown 33192 1726883092.60127: variable 'ansible_search_path' from source: unknown 33192 1726883092.60191: calling self._execute() 33192 1726883092.60280: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.60288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.60297: variable 'omit' from source: magic vars 33192 1726883092.60597: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.60610: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.60704: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.60716: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.60719: when evaluation is False, skipping this task 33192 1726883092.60722: _execute() done 33192 1726883092.60724: dumping result to json 33192 1726883092.60727: done dumping result, returning 33192 1726883092.60736: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-6c15-6a7e-000000000106] 33192 1726883092.60742: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000106 33192 1726883092.60845: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000106 33192 1726883092.60848: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.60896: no more pending results, returning what we have 33192 1726883092.60899: 
results queue empty 33192 1726883092.60900: checking for any_errors_fatal 33192 1726883092.60905: done checking for any_errors_fatal 33192 1726883092.60906: checking for max_fail_percentage 33192 1726883092.60908: done checking for max_fail_percentage 33192 1726883092.60909: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.60910: done checking to see if all hosts have failed 33192 1726883092.60911: getting the remaining hosts for this loop 33192 1726883092.60912: done getting the remaining hosts for this loop 33192 1726883092.60916: getting the next task for host managed_node1 33192 1726883092.60922: done getting next task for host managed_node1 33192 1726883092.60926: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33192 1726883092.60930: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.60948: getting variables 33192 1726883092.60950: in VariableManager get_vars() 33192 1726883092.60984: Calling all_inventory to load vars for managed_node1 33192 1726883092.60985: Calling groups_inventory to load vars for managed_node1 33192 1726883092.60987: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.60993: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.60995: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.60998: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.61146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.61340: done with get_vars() 33192 1726883092.61348: done getting variables 33192 1726883092.61397: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:44:52 -0400 (0:00:00.019) 0:00:06.014 ****** 33192 1726883092.61422: entering _queue_task() for managed_node1/package 33192 1726883092.61597: worker is 1 (out of 1 available) 33192 1726883092.61611: exiting _queue_task() for managed_node1/package 33192 1726883092.61622: done queuing things up, now waiting for results queue to drain 33192 1726883092.61624: waiting for pending results... 
33192 1726883092.61797: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33192 1726883092.61901: in run() - task 0affe814-3a2d-6c15-6a7e-000000000107 33192 1726883092.61913: variable 'ansible_search_path' from source: unknown 33192 1726883092.61918: variable 'ansible_search_path' from source: unknown 33192 1726883092.61949: calling self._execute() 33192 1726883092.62018: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.62025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.62038: variable 'omit' from source: magic vars 33192 1726883092.62511: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.62740: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.62744: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.62747: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.62750: when evaluation is False, skipping this task 33192 1726883092.62752: _execute() done 33192 1726883092.62755: dumping result to json 33192 1726883092.62757: done dumping result, returning 33192 1726883092.62760: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-6c15-6a7e-000000000107] 33192 1726883092.62763: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000107 33192 1726883092.62833: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000107 33192 1726883092.62840: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.62975: no more pending results, returning what we have 33192 1726883092.62979: results queue 
empty 33192 1726883092.62980: checking for any_errors_fatal 33192 1726883092.62986: done checking for any_errors_fatal 33192 1726883092.62987: checking for max_fail_percentage 33192 1726883092.62989: done checking for max_fail_percentage 33192 1726883092.62989: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.62991: done checking to see if all hosts have failed 33192 1726883092.62992: getting the remaining hosts for this loop 33192 1726883092.62993: done getting the remaining hosts for this loop 33192 1726883092.62997: getting the next task for host managed_node1 33192 1726883092.63003: done getting next task for host managed_node1 33192 1726883092.63007: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33192 1726883092.63011: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.63026: getting variables 33192 1726883092.63027: in VariableManager get_vars() 33192 1726883092.63070: Calling all_inventory to load vars for managed_node1 33192 1726883092.63073: Calling groups_inventory to load vars for managed_node1 33192 1726883092.63076: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.63084: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.63087: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.63091: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.63388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.63762: done with get_vars() 33192 1726883092.63781: done getting variables 33192 1726883092.63847: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:44:52 -0400 (0:00:00.024) 0:00:06.039 ****** 33192 1726883092.63894: entering _queue_task() for managed_node1/service 33192 1726883092.64150: worker is 1 (out of 1 available) 33192 1726883092.64165: exiting _queue_task() for managed_node1/service 33192 1726883092.64177: done queuing things up, now waiting for results queue to drain 33192 1726883092.64179: waiting for pending results... 
33192 1726883092.64489: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33192 1726883092.64650: in run() - task 0affe814-3a2d-6c15-6a7e-000000000108 33192 1726883092.64675: variable 'ansible_search_path' from source: unknown 33192 1726883092.64684: variable 'ansible_search_path' from source: unknown 33192 1726883092.64727: calling self._execute() 33192 1726883092.64830: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.64846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.64874: variable 'omit' from source: magic vars 33192 1726883092.65312: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.65330: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.65478: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.65490: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.65498: when evaluation is False, skipping this task 33192 1726883092.65505: _execute() done 33192 1726883092.65527: dumping result to json 33192 1726883092.65626: done dumping result, returning 33192 1726883092.65632: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-6c15-6a7e-000000000108] 33192 1726883092.65635: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000108 33192 1726883092.65719: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000108 33192 1726883092.65722: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.65788: no more pending results, returning what we have 33192 1726883092.65793: results queue empty 
33192 1726883092.65795: checking for any_errors_fatal 33192 1726883092.65802: done checking for any_errors_fatal 33192 1726883092.65802: checking for max_fail_percentage 33192 1726883092.65804: done checking for max_fail_percentage 33192 1726883092.65805: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.65807: done checking to see if all hosts have failed 33192 1726883092.65808: getting the remaining hosts for this loop 33192 1726883092.65810: done getting the remaining hosts for this loop 33192 1726883092.65814: getting the next task for host managed_node1 33192 1726883092.65823: done getting next task for host managed_node1 33192 1726883092.65827: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33192 1726883092.66019: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.66040: getting variables 33192 1726883092.66042: in VariableManager get_vars() 33192 1726883092.66088: Calling all_inventory to load vars for managed_node1 33192 1726883092.66091: Calling groups_inventory to load vars for managed_node1 33192 1726883092.66094: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.66103: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.66107: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.66110: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.66388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.66715: done with get_vars() 33192 1726883092.66727: done getting variables 33192 1726883092.66799: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:44:52 -0400 (0:00:00.029) 0:00:06.068 ****** 33192 1726883092.66838: entering _queue_task() for managed_node1/service 33192 1726883092.67157: worker is 1 (out of 1 available) 33192 1726883092.67171: exiting _queue_task() for managed_node1/service 33192 1726883092.67182: done queuing things up, now waiting for results queue to drain 33192 1726883092.67183: waiting for pending results... 
33192 1726883092.67557: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33192 1726883092.67659: in run() - task 0affe814-3a2d-6c15-6a7e-000000000109 33192 1726883092.67663: variable 'ansible_search_path' from source: unknown 33192 1726883092.67666: variable 'ansible_search_path' from source: unknown 33192 1726883092.67690: calling self._execute() 33192 1726883092.67790: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.67804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.67819: variable 'omit' from source: magic vars 33192 1726883092.68342: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.68422: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.68510: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.68533: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.68544: when evaluation is False, skipping this task 33192 1726883092.68553: _execute() done 33192 1726883092.68560: dumping result to json 33192 1726883092.68569: done dumping result, returning 33192 1726883092.68580: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-6c15-6a7e-000000000109] 33192 1726883092.68590: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000109 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33192 1726883092.68940: no more pending results, returning what we have 33192 1726883092.68943: results queue empty 33192 1726883092.68945: checking for any_errors_fatal 33192 1726883092.68950: done checking for any_errors_fatal 33192 1726883092.68951: checking for max_fail_percentage 33192 1726883092.68952: done 
checking for max_fail_percentage 33192 1726883092.68953: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.68955: done checking to see if all hosts have failed 33192 1726883092.68956: getting the remaining hosts for this loop 33192 1726883092.68957: done getting the remaining hosts for this loop 33192 1726883092.68961: getting the next task for host managed_node1 33192 1726883092.68967: done getting next task for host managed_node1 33192 1726883092.68971: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33192 1726883092.68975: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.68992: getting variables 33192 1726883092.68994: in VariableManager get_vars() 33192 1726883092.69041: Calling all_inventory to load vars for managed_node1 33192 1726883092.69045: Calling groups_inventory to load vars for managed_node1 33192 1726883092.69047: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.69057: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.69060: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.69063: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.69338: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000109 33192 1726883092.69341: WORKER PROCESS EXITING 33192 1726883092.69380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.69715: done with get_vars() 33192 1726883092.69727: done getting variables 33192 1726883092.69801: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:44:52 -0400 (0:00:00.029) 0:00:06.098 ****** 33192 1726883092.69842: entering _queue_task() for managed_node1/service 33192 1726883092.70093: worker is 1 (out of 1 available) 33192 1726883092.70107: exiting _queue_task() for managed_node1/service 33192 1726883092.70118: done queuing things up, now waiting for results queue to drain 33192 1726883092.70237: waiting for pending results... 
33192 1726883092.70409: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33192 1726883092.70591: in run() - task 0affe814-3a2d-6c15-6a7e-00000000010a 33192 1726883092.70611: variable 'ansible_search_path' from source: unknown 33192 1726883092.70620: variable 'ansible_search_path' from source: unknown 33192 1726883092.70668: calling self._execute() 33192 1726883092.70763: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.70781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.70806: variable 'omit' from source: magic vars 33192 1726883092.71237: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.71256: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.71410: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.71422: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.71443: when evaluation is False, skipping this task 33192 1726883092.71543: _execute() done 33192 1726883092.71548: dumping result to json 33192 1726883092.71551: done dumping result, returning 33192 1726883092.71554: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-6c15-6a7e-00000000010a] 33192 1726883092.71556: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000010a 33192 1726883092.71622: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000010a 33192 1726883092.71625: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.71681: no more pending results, returning what we have 33192 1726883092.71685: results queue empty 33192 1726883092.71687: checking for any_errors_fatal 
33192 1726883092.71694: done checking for any_errors_fatal 33192 1726883092.71695: checking for max_fail_percentage 33192 1726883092.71697: done checking for max_fail_percentage 33192 1726883092.71698: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.71700: done checking to see if all hosts have failed 33192 1726883092.71701: getting the remaining hosts for this loop 33192 1726883092.71703: done getting the remaining hosts for this loop 33192 1726883092.71707: getting the next task for host managed_node1 33192 1726883092.71715: done getting next task for host managed_node1 33192 1726883092.71718: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 33192 1726883092.71724: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.71747: getting variables 33192 1726883092.71749: in VariableManager get_vars() 33192 1726883092.71803: Calling all_inventory to load vars for managed_node1 33192 1726883092.71806: Calling groups_inventory to load vars for managed_node1 33192 1726883092.71809: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.71821: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.71825: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.71828: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.72282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.72952: done with get_vars() 33192 1726883092.72964: done getting variables 33192 1726883092.73031: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:44:52 -0400 (0:00:00.032) 0:00:06.131 ****** 33192 1726883092.73071: entering _queue_task() for managed_node1/service 33192 1726883092.73400: worker is 1 (out of 1 available) 33192 1726883092.73415: exiting _queue_task() for managed_node1/service 33192 1726883092.73427: done queuing things up, now waiting for results queue to drain 33192 1726883092.73428: waiting for pending results... 
33192 1726883092.73569: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 33192 1726883092.73681: in run() - task 0affe814-3a2d-6c15-6a7e-00000000010b 33192 1726883092.73693: variable 'ansible_search_path' from source: unknown 33192 1726883092.73697: variable 'ansible_search_path' from source: unknown 33192 1726883092.73728: calling self._execute() 33192 1726883092.73811: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.73818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.73851: variable 'omit' from source: magic vars 33192 1726883092.74142: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.74153: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.74253: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.74257: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.74262: when evaluation is False, skipping this task 33192 1726883092.74265: _execute() done 33192 1726883092.74270: dumping result to json 33192 1726883092.74275: done dumping result, returning 33192 1726883092.74281: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-6c15-6a7e-00000000010b] 33192 1726883092.74287: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000010b 33192 1726883092.74386: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000010b 33192 1726883092.74390: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33192 1726883092.74437: no more pending results, returning what we have 33192 1726883092.74441: results queue empty 33192 1726883092.74442: checking for any_errors_fatal 33192 
1726883092.74449: done checking for any_errors_fatal 33192 1726883092.74449: checking for max_fail_percentage 33192 1726883092.74451: done checking for max_fail_percentage 33192 1726883092.74452: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.74453: done checking to see if all hosts have failed 33192 1726883092.74454: getting the remaining hosts for this loop 33192 1726883092.74455: done getting the remaining hosts for this loop 33192 1726883092.74458: getting the next task for host managed_node1 33192 1726883092.74463: done getting next task for host managed_node1 33192 1726883092.74467: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33192 1726883092.74471: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.74491: getting variables 33192 1726883092.74493: in VariableManager get_vars() 33192 1726883092.74537: Calling all_inventory to load vars for managed_node1 33192 1726883092.74540: Calling groups_inventory to load vars for managed_node1 33192 1726883092.74543: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.74551: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.74553: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.74556: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.74698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.74907: done with get_vars() 33192 1726883092.74915: done getting variables 33192 1726883092.74960: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:44:52 -0400 (0:00:00.019) 0:00:06.150 ****** 33192 1726883092.74989: entering _queue_task() for managed_node1/copy 33192 1726883092.75185: worker is 1 (out of 1 available) 33192 1726883092.75198: exiting _queue_task() for managed_node1/copy 33192 1726883092.75209: done queuing things up, now waiting for results queue to drain 33192 1726883092.75210: waiting for pending results... 
33192 1726883092.75466: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33192 1726883092.75574: in run() - task 0affe814-3a2d-6c15-6a7e-00000000010c 33192 1726883092.75579: variable 'ansible_search_path' from source: unknown 33192 1726883092.75582: variable 'ansible_search_path' from source: unknown 33192 1726883092.75615: calling self._execute() 33192 1726883092.75685: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.75692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.75703: variable 'omit' from source: magic vars 33192 1726883092.76002: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.76240: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.76245: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.76248: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.76251: when evaluation is False, skipping this task 33192 1726883092.76254: _execute() done 33192 1726883092.76258: dumping result to json 33192 1726883092.76261: done dumping result, returning 33192 1726883092.76267: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-6c15-6a7e-00000000010c] 33192 1726883092.76269: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000010c skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.76391: no more pending results, returning what we have 33192 1726883092.76396: results queue empty 33192 1726883092.76397: checking for any_errors_fatal 33192 1726883092.76404: done checking for any_errors_fatal 33192 1726883092.76405: checking for 
max_fail_percentage 33192 1726883092.76407: done checking for max_fail_percentage 33192 1726883092.76408: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.76409: done checking to see if all hosts have failed 33192 1726883092.76410: getting the remaining hosts for this loop 33192 1726883092.76412: done getting the remaining hosts for this loop 33192 1726883092.76416: getting the next task for host managed_node1 33192 1726883092.76422: done getting next task for host managed_node1 33192 1726883092.76426: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33192 1726883092.76431: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.76502: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000010c 33192 1726883092.76505: WORKER PROCESS EXITING 33192 1726883092.76517: getting variables 33192 1726883092.76519: in VariableManager get_vars() 33192 1726883092.76563: Calling all_inventory to load vars for managed_node1 33192 1726883092.76566: Calling groups_inventory to load vars for managed_node1 33192 1726883092.76569: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.76578: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.76581: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.76585: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.76850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.77036: done with get_vars() 33192 1726883092.77044: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:44:52 -0400 (0:00:00.021) 0:00:06.171 ****** 33192 1726883092.77108: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 33192 1726883092.77275: worker is 1 (out of 1 available) 33192 1726883092.77289: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 33192 1726883092.77300: done queuing things up, now waiting for results queue to drain 33192 1726883092.77302: waiting for pending results... 
33192 1726883092.77474: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33192 1726883092.77569: in run() - task 0affe814-3a2d-6c15-6a7e-00000000010d 33192 1726883092.77586: variable 'ansible_search_path' from source: unknown 33192 1726883092.77590: variable 'ansible_search_path' from source: unknown 33192 1726883092.77620: calling self._execute() 33192 1726883092.77690: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.77697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.77707: variable 'omit' from source: magic vars 33192 1726883092.78003: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.78013: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.78113: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.78117: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.78121: when evaluation is False, skipping this task 33192 1726883092.78126: _execute() done 33192 1726883092.78130: dumping result to json 33192 1726883092.78137: done dumping result, returning 33192 1726883092.78145: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-6c15-6a7e-00000000010d] 33192 1726883092.78150: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000010d 33192 1726883092.78257: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000010d 33192 1726883092.78261: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.78322: no more pending results, returning what we have 33192 1726883092.78326: results queue empty 33192 1726883092.78327: checking 
for any_errors_fatal 33192 1726883092.78331: done checking for any_errors_fatal 33192 1726883092.78332: checking for max_fail_percentage 33192 1726883092.78336: done checking for max_fail_percentage 33192 1726883092.78337: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.78338: done checking to see if all hosts have failed 33192 1726883092.78339: getting the remaining hosts for this loop 33192 1726883092.78340: done getting the remaining hosts for this loop 33192 1726883092.78344: getting the next task for host managed_node1 33192 1726883092.78353: done getting next task for host managed_node1 33192 1726883092.78357: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 33192 1726883092.78360: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.78374: getting variables 33192 1726883092.78376: in VariableManager get_vars() 33192 1726883092.78405: Calling all_inventory to load vars for managed_node1 33192 1726883092.78407: Calling groups_inventory to load vars for managed_node1 33192 1726883092.78409: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.78415: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.78417: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.78419: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.78568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.78755: done with get_vars() 33192 1726883092.78763: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:52 -0400 (0:00:00.017) 0:00:06.188 ****** 33192 1726883092.78830: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 33192 1726883092.79007: worker is 1 (out of 1 available) 33192 1726883092.79021: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 33192 1726883092.79031: done queuing things up, now waiting for results queue to drain 33192 1726883092.79033: waiting for pending results... 
33192 1726883092.79203: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 33192 1726883092.79294: in run() - task 0affe814-3a2d-6c15-6a7e-00000000010e 33192 1726883092.79308: variable 'ansible_search_path' from source: unknown 33192 1726883092.79311: variable 'ansible_search_path' from source: unknown 33192 1726883092.79345: calling self._execute() 33192 1726883092.79412: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.79419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.79430: variable 'omit' from source: magic vars 33192 1726883092.79723: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.79735: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.79832: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.79839: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.79842: when evaluation is False, skipping this task 33192 1726883092.79847: _execute() done 33192 1726883092.79850: dumping result to json 33192 1726883092.79855: done dumping result, returning 33192 1726883092.79862: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-6c15-6a7e-00000000010e] 33192 1726883092.79867: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000010e skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.80018: no more pending results, returning what we have 33192 1726883092.80022: results queue empty 33192 1726883092.80023: checking for any_errors_fatal 33192 1726883092.80028: done checking for any_errors_fatal 33192 1726883092.80029: checking for max_fail_percentage 33192 1726883092.80031: done 
checking for max_fail_percentage 33192 1726883092.80031: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.80033: done checking to see if all hosts have failed 33192 1726883092.80035: getting the remaining hosts for this loop 33192 1726883092.80037: done getting the remaining hosts for this loop 33192 1726883092.80041: getting the next task for host managed_node1 33192 1726883092.80048: done getting next task for host managed_node1 33192 1726883092.80052: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33192 1726883092.80057: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.80076: getting variables 33192 1726883092.80077: in VariableManager get_vars() 33192 1726883092.80109: Calling all_inventory to load vars for managed_node1 33192 1726883092.80111: Calling groups_inventory to load vars for managed_node1 33192 1726883092.80113: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.80121: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.80123: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.80126: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.80306: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000010e 33192 1726883092.80310: WORKER PROCESS EXITING 33192 1726883092.80322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.80510: done with get_vars() 33192 1726883092.80519: done getting variables 33192 1726883092.80563: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:52 -0400 (0:00:00.017) 0:00:06.206 ****** 33192 1726883092.80590: entering _queue_task() for managed_node1/debug 33192 1726883092.80767: worker is 1 (out of 1 available) 33192 1726883092.80785: exiting _queue_task() for managed_node1/debug 33192 1726883092.80796: done queuing things up, now waiting for results queue to drain 33192 1726883092.80797: waiting for pending results... 
33192 1726883092.80961: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33192 1726883092.81053: in run() - task 0affe814-3a2d-6c15-6a7e-00000000010f 33192 1726883092.81067: variable 'ansible_search_path' from source: unknown 33192 1726883092.81080: variable 'ansible_search_path' from source: unknown 33192 1726883092.81112: calling self._execute() 33192 1726883092.81185: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.81193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.81203: variable 'omit' from source: magic vars 33192 1726883092.81501: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.81513: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.81607: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.81614: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.81617: when evaluation is False, skipping this task 33192 1726883092.81620: _execute() done 33192 1726883092.81625: dumping result to json 33192 1726883092.81627: done dumping result, returning 33192 1726883092.81640: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-6c15-6a7e-00000000010f] 33192 1726883092.81644: sending task result for task 0affe814-3a2d-6c15-6a7e-00000000010f 33192 1726883092.81743: done sending task result for task 0affe814-3a2d-6c15-6a7e-00000000010f 33192 1726883092.81747: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883092.81795: no more pending results, returning what we have 33192 1726883092.81798: results queue empty 33192 1726883092.81799: checking for any_errors_fatal 33192 1726883092.81804: done 
checking for any_errors_fatal 33192 1726883092.81805: checking for max_fail_percentage 33192 1726883092.81807: done checking for max_fail_percentage 33192 1726883092.81807: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.81809: done checking to see if all hosts have failed 33192 1726883092.81809: getting the remaining hosts for this loop 33192 1726883092.81811: done getting the remaining hosts for this loop 33192 1726883092.81814: getting the next task for host managed_node1 33192 1726883092.81820: done getting next task for host managed_node1 33192 1726883092.81823: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33192 1726883092.81828: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.81846: getting variables 33192 1726883092.81848: in VariableManager get_vars() 33192 1726883092.81887: Calling all_inventory to load vars for managed_node1 33192 1726883092.81890: Calling groups_inventory to load vars for managed_node1 33192 1726883092.81892: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.81898: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.81900: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.81902: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.82050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.82245: done with get_vars() 33192 1726883092.82253: done getting variables 33192 1726883092.82301: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:44:52 -0400 (0:00:00.017) 0:00:06.223 ****** 33192 1726883092.82326: entering _queue_task() for managed_node1/debug 33192 1726883092.82510: worker is 1 (out of 1 available) 33192 1726883092.82523: exiting _queue_task() for managed_node1/debug 33192 1726883092.82536: done queuing things up, now waiting for results queue to drain 33192 1726883092.82537: waiting for pending results... 
33192 1726883092.82693: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33192 1726883092.82786: in run() - task 0affe814-3a2d-6c15-6a7e-000000000110 33192 1726883092.82800: variable 'ansible_search_path' from source: unknown 33192 1726883092.82804: variable 'ansible_search_path' from source: unknown 33192 1726883092.82833: calling self._execute() 33192 1726883092.82902: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.82909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.82919: variable 'omit' from source: magic vars 33192 1726883092.83439: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.83443: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.83490: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.83500: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.83508: when evaluation is False, skipping this task 33192 1726883092.83514: _execute() done 33192 1726883092.83521: dumping result to json 33192 1726883092.83528: done dumping result, returning 33192 1726883092.83541: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-6c15-6a7e-000000000110] 33192 1726883092.83550: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000110 skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883092.83696: no more pending results, returning what we have 33192 1726883092.83700: results queue empty 33192 1726883092.83701: checking for any_errors_fatal 33192 1726883092.83707: done checking for any_errors_fatal 33192 1726883092.83708: checking for max_fail_percentage 33192 1726883092.83710: done checking for 
max_fail_percentage 33192 1726883092.83711: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.83712: done checking to see if all hosts have failed 33192 1726883092.83713: getting the remaining hosts for this loop 33192 1726883092.83715: done getting the remaining hosts for this loop 33192 1726883092.83719: getting the next task for host managed_node1 33192 1726883092.83727: done getting next task for host managed_node1 33192 1726883092.83731: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33192 1726883092.83745: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.83757: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000110 33192 1726883092.83760: WORKER PROCESS EXITING 33192 1726883092.83776: getting variables 33192 1726883092.83777: in VariableManager get_vars() 33192 1726883092.83819: Calling all_inventory to load vars for managed_node1 33192 1726883092.83822: Calling groups_inventory to load vars for managed_node1 33192 1726883092.83825: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.83837: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.83841: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.83851: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.84133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.84460: done with get_vars() 33192 1726883092.84473: done getting variables 33192 1726883092.84537: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:44:52 -0400 (0:00:00.022) 0:00:06.246 ****** 33192 1726883092.84570: entering _queue_task() for managed_node1/debug 33192 1726883092.84806: worker is 1 (out of 1 available) 33192 1726883092.84821: exiting _queue_task() for managed_node1/debug 33192 1726883092.84949: done queuing things up, now waiting for results queue to drain 33192 1726883092.84951: waiting for pending results... 
33192 1726883092.85124: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33192 1726883092.85259: in run() - task 0affe814-3a2d-6c15-6a7e-000000000111 33192 1726883092.85275: variable 'ansible_search_path' from source: unknown 33192 1726883092.85281: variable 'ansible_search_path' from source: unknown 33192 1726883092.85315: calling self._execute() 33192 1726883092.85387: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.85394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.85407: variable 'omit' from source: magic vars 33192 1726883092.85698: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.85710: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.85806: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.85812: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.85815: when evaluation is False, skipping this task 33192 1726883092.85818: _execute() done 33192 1726883092.85827: dumping result to json 33192 1726883092.85832: done dumping result, returning 33192 1726883092.85835: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-6c15-6a7e-000000000111] 33192 1726883092.85844: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000111 33192 1726883092.85938: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000111 33192 1726883092.85942: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 33192 1726883092.85995: no more pending results, returning what we have 33192 1726883092.85998: results queue empty 33192 1726883092.86000: checking for any_errors_fatal 33192 1726883092.86004: done checking for 
any_errors_fatal 33192 1726883092.86005: checking for max_fail_percentage 33192 1726883092.86006: done checking for max_fail_percentage 33192 1726883092.86007: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.86008: done checking to see if all hosts have failed 33192 1726883092.86009: getting the remaining hosts for this loop 33192 1726883092.86010: done getting the remaining hosts for this loop 33192 1726883092.86014: getting the next task for host managed_node1 33192 1726883092.86019: done getting next task for host managed_node1 33192 1726883092.86023: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 33192 1726883092.86028: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.86045: getting variables 33192 1726883092.86047: in VariableManager get_vars() 33192 1726883092.86087: Calling all_inventory to load vars for managed_node1 33192 1726883092.86089: Calling groups_inventory to load vars for managed_node1 33192 1726883092.86091: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.86097: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.86099: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.86101: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.86251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.86458: done with get_vars() 33192 1726883092.86466: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:44:52 -0400 (0:00:00.019) 0:00:06.265 ****** 33192 1726883092.86543: entering _queue_task() for managed_node1/ping 33192 1726883092.86724: worker is 1 (out of 1 available) 33192 1726883092.86739: exiting _queue_task() for managed_node1/ping 33192 1726883092.86751: done queuing things up, now waiting for results queue to drain 33192 1726883092.86752: waiting for pending results... 
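
Every skipped task above follows the same two-stage gate visible in the trace: a role-level guard `ansible_distribution_major_version != '6'` evaluates True, then the test-specific condition `ansible_distribution_major_version == '7'` evaluates False, so the executor short-circuits and reports `false_condition` without running the module. A minimal sketch of what such a guarded debug task could look like — the task name and both conditions are taken from the log; the module and the `__network_connections_result` variable are illustrative assumptions, not the role's actual source:

```yaml
# Sketch only: the when-guard pattern this log shows being evaluated.
# "__network_connections_result" is a hypothetical variable name.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
  when:
    - ansible_distribution_major_version != '6'   # evaluated True above
    - ansible_distribution_major_version == '7'   # evaluated False -> skip
```

Because `when` entries in a list are ANDed, the first False condition is enough to skip the task, which is exactly the `"false_condition": "ansible_distribution_major_version == '7'"` result printed for each skip.
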
33192 1726883092.86909: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 33192 1726883092.87006: in run() - task 0affe814-3a2d-6c15-6a7e-000000000112 33192 1726883092.87019: variable 'ansible_search_path' from source: unknown 33192 1726883092.87022: variable 'ansible_search_path' from source: unknown 33192 1726883092.87059: calling self._execute() 33192 1726883092.87135: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.87142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.87152: variable 'omit' from source: magic vars 33192 1726883092.87460: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.87470: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.87570: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.87579: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.87582: when evaluation is False, skipping this task 33192 1726883092.87585: _execute() done 33192 1726883092.87590: dumping result to json 33192 1726883092.87594: done dumping result, returning 33192 1726883092.87602: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-6c15-6a7e-000000000112] 33192 1726883092.87607: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000112 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.87749: no more pending results, returning what we have 33192 1726883092.87755: results queue empty 33192 1726883092.87756: checking for any_errors_fatal 33192 1726883092.87764: done checking for any_errors_fatal 33192 1726883092.87765: checking for max_fail_percentage 33192 1726883092.87767: done checking for 
max_fail_percentage 33192 1726883092.87767: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.87769: done checking to see if all hosts have failed 33192 1726883092.87770: getting the remaining hosts for this loop 33192 1726883092.87771: done getting the remaining hosts for this loop 33192 1726883092.87775: getting the next task for host managed_node1 33192 1726883092.87784: done getting next task for host managed_node1 33192 1726883092.87786: ^ task is: TASK: meta (role_complete) 33192 1726883092.87790: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.87808: getting variables 33192 1726883092.87810: in VariableManager get_vars() 33192 1726883092.87848: Calling all_inventory to load vars for managed_node1 33192 1726883092.87850: Calling groups_inventory to load vars for managed_node1 33192 1726883092.87852: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.87860: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.87862: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.87865: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.88010: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000112 33192 1726883092.88013: WORKER PROCESS EXITING 33192 1726883092.88025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.88214: done with get_vars() 33192 1726883092.88223: done getting variables 33192 1726883092.88290: done queuing things up, now waiting for results queue to drain 33192 1726883092.88292: results queue empty 33192 1726883092.88292: checking for any_errors_fatal 33192 1726883092.88294: done checking for any_errors_fatal 33192 1726883092.88294: checking for max_fail_percentage 33192 1726883092.88295: done checking for max_fail_percentage 33192 1726883092.88296: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.88297: done checking to see if all hosts have failed 33192 1726883092.88297: getting the remaining hosts for this loop 33192 1726883092.88298: done getting the remaining hosts for this loop 33192 1726883092.88300: getting the next task for host managed_node1 33192 1726883092.88303: done getting next task for host managed_node1 33192 1726883092.88304: ^ task is: TASK: Include the task 'cleanup_mock_wifi.yml' 33192 1726883092.88306: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 33192 1726883092.88308: getting variables 33192 1726883092.88309: in VariableManager get_vars() 33192 1726883092.88321: Calling all_inventory to load vars for managed_node1 33192 1726883092.88323: Calling groups_inventory to load vars for managed_node1 33192 1726883092.88325: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.88328: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.88330: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.88332: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.88480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.88658: done with get_vars() 33192 1726883092.88665: done getting variables TASK [Include the task 'cleanup_mock_wifi.yml'] ******************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:96 Friday 20 September 2024 21:44:52 -0400 (0:00:00.021) 0:00:06.287 ****** 33192 1726883092.88721: entering _queue_task() for managed_node1/include_tasks 33192 1726883092.88913: worker is 1 (out of 1 available) 33192 1726883092.88929: exiting _queue_task() for managed_node1/include_tasks 33192 1726883092.88941: done queuing things up, now waiting for results queue to drain 33192 1726883092.88942: waiting for pending results... 
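
The `Include the task 'cleanup_mock_wifi.yml'` entry queued above is gated the same way; when the condition is False the include itself is skipped, so none of the included file's tasks are ever queued or counted. A hedged sketch of that shape — the task name, file name, and condition come from the log, while the relative path is an assumed conventional layout:

```yaml
# Sketch only: a conditional include of the kind this run skips.
- name: Include the task 'cleanup_mock_wifi.yml'
  ansible.builtin.include_tasks: tasks/cleanup_mock_wifi.yml
  when: ansible_distribution_major_version == '7'
```
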
33192 1726883092.89110: running TaskExecutor() for managed_node1/TASK: Include the task 'cleanup_mock_wifi.yml' 33192 1726883092.89189: in run() - task 0affe814-3a2d-6c15-6a7e-000000000142 33192 1726883092.89201: variable 'ansible_search_path' from source: unknown 33192 1726883092.89232: calling self._execute() 33192 1726883092.89305: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.89314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.89322: variable 'omit' from source: magic vars 33192 1726883092.89625: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.89637: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.89737: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.89743: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.89746: when evaluation is False, skipping this task 33192 1726883092.89751: _execute() done 33192 1726883092.89754: dumping result to json 33192 1726883092.89759: done dumping result, returning 33192 1726883092.89765: done running TaskExecutor() for managed_node1/TASK: Include the task 'cleanup_mock_wifi.yml' [0affe814-3a2d-6c15-6a7e-000000000142] 33192 1726883092.89771: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000142 33192 1726883092.89870: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000142 33192 1726883092.89873: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.89922: no more pending results, returning what we have 33192 1726883092.89926: results queue empty 33192 1726883092.89927: checking for any_errors_fatal 33192 1726883092.89929: done checking for any_errors_fatal 33192 1726883092.89930: checking for max_fail_percentage 33192 
1726883092.89931: done checking for max_fail_percentage 33192 1726883092.89932: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.89936: done checking to see if all hosts have failed 33192 1726883092.89937: getting the remaining hosts for this loop 33192 1726883092.89939: done getting the remaining hosts for this loop 33192 1726883092.89943: getting the next task for host managed_node1 33192 1726883092.89948: done getting next task for host managed_node1 33192 1726883092.89951: ^ task is: TASK: Verify network state restored to default 33192 1726883092.89954: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33192 1726883092.89957: getting variables 33192 1726883092.89959: in VariableManager get_vars() 33192 1726883092.90003: Calling all_inventory to load vars for managed_node1 33192 1726883092.90005: Calling groups_inventory to load vars for managed_node1 33192 1726883092.90007: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.90014: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.90016: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.90018: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.90168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.90353: done with get_vars() 33192 1726883092.90361: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:98 Friday 20 September 2024 21:44:52 -0400 (0:00:00.017) 0:00:06.304 ****** 33192 1726883092.90431: entering _queue_task() for managed_node1/include_tasks 33192 1726883092.90611: worker is 1 (out of 1 available) 33192 1726883092.90624: exiting _queue_task() for managed_node1/include_tasks 33192 1726883092.90638: done queuing things up, now waiting for results queue to drain 33192 1726883092.90640: waiting for pending results... 
33192 1726883092.90806: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 33192 1726883092.90884: in run() - task 0affe814-3a2d-6c15-6a7e-000000000143 33192 1726883092.90897: variable 'ansible_search_path' from source: unknown 33192 1726883092.90928: calling self._execute() 33192 1726883092.91003: variable 'ansible_host' from source: host vars for 'managed_node1' 33192 1726883092.91011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33192 1726883092.91020: variable 'omit' from source: magic vars 33192 1726883092.91326: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.91338: Evaluated conditional (ansible_distribution_major_version != '6'): True 33192 1726883092.91432: variable 'ansible_distribution_major_version' from source: facts 33192 1726883092.91440: Evaluated conditional (ansible_distribution_major_version == '7'): False 33192 1726883092.91443: when evaluation is False, skipping this task 33192 1726883092.91448: _execute() done 33192 1726883092.91451: dumping result to json 33192 1726883092.91457: done dumping result, returning 33192 1726883092.91463: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0affe814-3a2d-6c15-6a7e-000000000143] 33192 1726883092.91468: sending task result for task 0affe814-3a2d-6c15-6a7e-000000000143 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33192 1726883092.91617: no more pending results, returning what we have 33192 1726883092.91621: results queue empty 33192 1726883092.91622: checking for any_errors_fatal 33192 1726883092.91628: done checking for any_errors_fatal 33192 1726883092.91629: checking for max_fail_percentage 33192 1726883092.91631: done checking for max_fail_percentage 33192 1726883092.91632: checking to see if all hosts have failed and the running result is 
not ok 33192 1726883092.91633: done checking to see if all hosts have failed 33192 1726883092.91636: getting the remaining hosts for this loop 33192 1726883092.91637: done getting the remaining hosts for this loop 33192 1726883092.91641: getting the next task for host managed_node1 33192 1726883092.91650: done getting next task for host managed_node1 33192 1726883092.91653: ^ task is: TASK: meta (flush_handlers) 33192 1726883092.91655: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33192 1726883092.91659: getting variables 33192 1726883092.91660: in VariableManager get_vars() 33192 1726883092.91701: Calling all_inventory to load vars for managed_node1 33192 1726883092.91703: Calling groups_inventory to load vars for managed_node1 33192 1726883092.91705: Calling all_plugins_inventory to load vars for managed_node1 33192 1726883092.91714: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.91716: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.91719: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.91901: done sending task result for task 0affe814-3a2d-6c15-6a7e-000000000143 33192 1726883092.91906: WORKER PROCESS EXITING 33192 1726883092.91919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.92099: done with get_vars() 33192 1726883092.92107: done getting variables 33192 1726883092.92175: in VariableManager get_vars() 33192 1726883092.92188: Calling all_inventory to load vars for managed_node1 33192 1726883092.92190: Calling groups_inventory to load vars for managed_node1 33192 1726883092.92191: Calling all_plugins_inventory to load vars for managed_node1 33192 
1726883092.92195: Calling all_plugins_play to load vars for managed_node1 33192 1726883092.92197: Calling groups_plugins_inventory to load vars for managed_node1 33192 1726883092.92199: Calling groups_plugins_play to load vars for managed_node1 33192 1726883092.92321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33192 1726883092.92504: done with get_vars() 33192 1726883092.92515: done queuing things up, now waiting for results queue to drain 33192 1726883092.92516: results queue empty 33192 1726883092.92517: checking for any_errors_fatal 33192 1726883092.92518: done checking for any_errors_fatal 33192 1726883092.92519: checking for max_fail_percentage 33192 1726883092.92520: done checking for max_fail_percentage 33192 1726883092.92520: checking to see if all hosts have failed and the running result is not ok 33192 1726883092.92521: done checking to see if all hosts have failed 33192 1726883092.92522: getting the remaining hosts for this loop 33192 1726883092.92522: done getting the remaining hosts for this loop 33192 1726883092.92524: getting the next task for host managed_node1 33192 1726883092.92526: done getting next task for host managed_node1 33192 1726883092.92528: ^ task is: TASK: meta (flush_handlers) 33192 1726883092.92529: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
33192 1726883092.92530: getting variables
33192 1726883092.92531: in VariableManager get_vars()
33192 1726883092.92545: Calling all_inventory to load vars for managed_node1
33192 1726883092.92546: Calling groups_inventory to load vars for managed_node1
33192 1726883092.92548: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883092.92552: Calling all_plugins_play to load vars for managed_node1
33192 1726883092.92553: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883092.92556: Calling groups_plugins_play to load vars for managed_node1
33192 1726883092.92680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883092.92870: done with get_vars()
33192 1726883092.92879: done getting variables
33192 1726883092.92917: in VariableManager get_vars()
33192 1726883092.92930: Calling all_inventory to load vars for managed_node1
33192 1726883092.92931: Calling groups_inventory to load vars for managed_node1
33192 1726883092.92933: Calling all_plugins_inventory to load vars for managed_node1
33192 1726883092.92938: Calling all_plugins_play to load vars for managed_node1
33192 1726883092.92940: Calling groups_plugins_inventory to load vars for managed_node1
33192 1726883092.92942: Calling groups_plugins_play to load vars for managed_node1
33192 1726883092.93061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33192 1726883092.93237: done with get_vars()
33192 1726883092.93246: done queuing things up, now waiting for results queue to drain
33192 1726883092.93248: results queue empty
33192 1726883092.93248: checking for any_errors_fatal
33192 1726883092.93249: done checking for any_errors_fatal
33192 1726883092.93250: checking for max_fail_percentage
33192 1726883092.93251: done checking for max_fail_percentage
33192 1726883092.93251: checking to see if all hosts have failed and the running result is not ok
33192 1726883092.93252: done checking to see if all hosts have failed
33192 1726883092.93252: getting the remaining hosts for this loop
33192 1726883092.93253: done getting the remaining hosts for this loop
33192 1726883092.93259: getting the next task for host managed_node1
33192 1726883092.93261: done getting next task for host managed_node1
33192 1726883092.93261: ^ task is: None
33192 1726883092.93262: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33192 1726883092.93263: done queuing things up, now waiting for results queue to drain
33192 1726883092.93264: results queue empty
33192 1726883092.93264: checking for any_errors_fatal
33192 1726883092.93265: done checking for any_errors_fatal
33192 1726883092.93266: checking for max_fail_percentage
33192 1726883092.93266: done checking for max_fail_percentage
33192 1726883092.93267: checking to see if all hosts have failed and the running result is not ok
33192 1726883092.93267: done checking to see if all hosts have failed
33192 1726883092.93269: getting the next task for host managed_node1
33192 1726883092.93272: done getting next task for host managed_node1
33192 1726883092.93273: ^ task is: None
33192 1726883092.93274: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1              : ok=7    changed=0    unreachable=0    failed=0    skipped=102  rescued=0    ignored=0

Friday 20 September 2024  21:44:52 -0400 (0:00:00.028)       0:00:06.333 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 1.81s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Check if system is ostree ----------------------------------------------- 0.92s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Gather the minimum subset of ansible_facts required by the network role test --- 0.91s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Include the task 'enable_epel.yml' -------------------------------------- 0.09s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
fedora.linux_system_roles.network : Show debug messages for the network_connections --- 0.08s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
fedora.linux_system_roles.network : Configure networking state ---------- 0.06s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
fedora.linux_system_roles.network : Show stderr messages for the network_connections --- 0.06s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Set flag to indicate system is ostree ----------------------------------- 0.05s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.05s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Show debug messages for the network_state --- 0.05s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Include the task 'el_repo_setup.yml' ------------------------------------ 0.04s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:11
Set network provider to 'nm' -------------------------------------------- 0.04s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.04s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.04s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Show stderr messages for the network_connections --- 0.04s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
fedora.linux_system_roles.network : Print network provider -------------- 0.03s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider --- 0.03s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
fedora.linux_system_roles.network : Ensure initscripts network file dependency is present --- 0.03s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
fedora.linux_system_roles.network : Enable network service -------------- 0.03s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces --- 0.03s
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
33192 1726883092.93356: RUNNING CLEANUP