[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
35374 1726882912.48277: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Xyq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
35374 1726882912.49450: Added group all to inventory
35374 1726882912.49452: Added group ungrouped to inventory
35374 1726882912.49456: Group all now contains ungrouped
35374 1726882912.49459: Examining possible inventory source: /tmp/network-91m/inventory.yml
35374 1726882912.86777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
35374 1726882912.86838: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
35374 1726882912.86978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
35374 1726882912.87042: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
35374 1726882912.87233: Loaded config def from plugin (inventory/script)
35374 1726882912.87235: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
35374 1726882912.87405: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
35374 1726882912.88836: Loaded config def from plugin (inventory/yaml)
35374 1726882912.88838: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
35374 1726882912.88939: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
35374 1726882912.89366: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
35374 1726882912.89371: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
35374 1726882912.89374: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
35374 1726882912.89381: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
35374 1726882912.89385: Loading data from /tmp/network-91m/inventory.yml
35374 1726882912.89453: /tmp/network-91m/inventory.yml was not parsable by auto
35374 1726882912.90222: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
35374 1726882912.90262: Loading data from /tmp/network-91m/inventory.yml
35374 1726882912.90349: group all already in inventory
35374 1726882912.90356: set inventory_file for managed_node1
35374 1726882912.90360: set inventory_dir for managed_node1
35374 1726882912.90361: Added host managed_node1 to inventory
35374 1726882912.90366: Added host managed_node1 to group all
35374 1726882912.90367: set ansible_host for managed_node1
35374 1726882912.90370: set ansible_ssh_extra_args for managed_node1
35374 1726882912.90374: set inventory_file for managed_node2
35374 1726882912.90377: set inventory_dir for managed_node2
35374 1726882912.90378: Added host managed_node2 to inventory
35374 1726882912.90380: Added host managed_node2 to group all
35374 1726882912.90381: set ansible_host for managed_node2
35374 1726882912.90382: set ansible_ssh_extra_args for managed_node2
35374 1726882912.90384: set inventory_file for managed_node3
35374 1726882912.90387: set inventory_dir for managed_node3
35374 1726882912.90388: Added host managed_node3 to inventory
35374 1726882912.90389: Added host managed_node3 to group all
35374 1726882912.90390: set ansible_host for managed_node3
35374 1726882912.90390: set ansible_ssh_extra_args for managed_node3
35374 1726882912.90393: Reconcile groups and hosts in inventory.
35374 1726882912.90397: Group ungrouped now contains managed_node1
35374 1726882912.90399: Group ungrouped now contains managed_node2
35374 1726882912.90400: Group ungrouped now contains managed_node3
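
At this point the yaml inventory plugin has built the three managed_node hosts with ansible_host and ansible_ssh_extra_args set from /tmp/network-91m/inventory.yml. A minimal sketch of inspecting the same inventory from Python follows; it assumes ansible-core's internal (version-dependent) loader API, and it sets the singular ANSIBLE_COLLECTIONS_PATH form that the deprecation warning above asks for:

    import os

    # The deprecation warning at the top: the singular env var is the
    # replacement for ANSIBLE_COLLECTIONS_PATHS (path taken from the log).
    os.environ["ANSIBLE_COLLECTIONS_PATH"] = "/tmp/collections-Xyq"

    from ansible.parsing.dataloader import DataLoader
    from ansible.inventory.manager import InventoryManager
    from ansible.vars.manager import VariableManager

    loader = DataLoader()
    inventory = InventoryManager(loader=loader,
                                 sources=["/tmp/network-91m/inventory.yml"])
    variable_manager = VariableManager(loader=loader, inventory=inventory)

    for host in inventory.get_hosts():
        hostvars = variable_manager.get_vars(host=host)
        # The log shows ansible_host and ansible_ssh_extra_args per host.
        print(host.name,
              hostvars.get("ansible_host"),
              hostvars.get("ansible_ssh_extra_args"))
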
35374 1726882912.90482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
35374 1726882912.90614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
35374 1726882912.90662: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
35374 1726882912.90705: Loaded config def from plugin (vars/host_group_vars)
35374 1726882912.90707: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
35374 1726882912.90714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
35374 1726882912.90722: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
35374 1726882912.90761: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
35374 1726882912.92299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
35374 1726882912.92395: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
35374 1726882912.92435: Loaded config def from plugin (connection/local)
35374 1726882912.92438: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
35374 1726882912.93251: Loaded config def from plugin (connection/paramiko_ssh)
35374 1726882912.93254: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
35374 1726882912.95425: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
35374 1726882912.95465: Loaded config def from plugin (connection/psrp)
35374 1726882912.95470: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
35374 1726882912.96744: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
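
The loader is scanning one plugin type after another here (vars, cache, connection; shell and become follow). A quick way to enumerate the same plugin sets without reading the trace is the ansible-doc CLI; a small sketch, run from the controller:

    import subprocess

    # List the connection plugins the loader can resolve; ansible-doc also
    # accepts -t shell, -t become, -t callback, -t inventory, etc.
    result = subprocess.run(
        ["ansible-doc", "-t", "connection", "-l"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)
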
35374 1726882912.96903: Loaded config def from plugin (connection/ssh)
35374 1726882912.96907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
35374 1726882913.01148: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
35374 1726882913.01197: Loaded config def from plugin (connection/winrm)
35374 1726882913.01201: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
35374 1726882913.01348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
35374 1726882913.01423: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
35374 1726882913.01581: Loaded config def from plugin (shell/cmd)
35374 1726882913.01656: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
35374 1726882913.01687: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
35374 1726882913.01875: Loaded config def from plugin (shell/powershell)
35374 1726882913.01877: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
35374 1726882913.01935: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
35374 1726882913.02367: Loaded config def from plugin (shell/sh)
35374 1726882913.02371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
35374 1726882913.02408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
35374 1726882913.03042: Loaded config def from plugin (become/runas)
35374 1726882913.03044: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
35374 1726882913.03474: Loaded config def from plugin (become/su)
35374 1726882913.03478: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
35374 1726882913.03883: Loaded config def from plugin (become/sudo)
35374 1726882913.03886: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
35374 1726882913.03919: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
35374 1726882913.04703: in VariableManager get_vars()
35374 1726882913.04827: done with get_vars()
35374 1726882913.05147: trying /usr/local/lib/python3.12/site-packages/ansible/modules
35374 1726882913.12435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
35374 1726882913.12638: in VariableManager get_vars()
35374 1726882913.12643: done with get_vars()
35374 1726882913.12646: variable 'playbook_dir' from source: magic vars
35374 1726882913.12646: variable 'ansible_playbook_python' from source: magic vars
35374 1726882913.12647: variable 'ansible_config_file' from source: magic vars
35374 1726882913.12648: variable 'groups' from source: magic vars
35374 1726882913.12648: variable 'omit' from source: magic vars
35374 1726882913.12649: variable 'ansible_version' from source: magic vars
35374 1726882913.12650: variable 'ansible_check_mode' from source: magic vars
35374 1726882913.12650: variable 'ansible_diff_mode' from source: magic vars
35374 1726882913.12651: variable 'ansible_forks' from source: magic vars
35374 1726882913.12652: variable 'ansible_inventory_sources' from source: magic vars
35374 1726882913.12652: variable 'ansible_skip_tags' from source: magic vars
35374 1726882913.12653: variable 'ansible_limit' from source: magic vars
35374 1726882913.12653: variable 'ansible_run_tags' from source: magic vars
35374 1726882913.12654: variable 'ansible_verbosity' from source: magic vars
35374 1726882913.12776: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml
35374 1726882913.14288: in VariableManager get_vars()
35374 1726882913.14307: done with get_vars()
35374 1726882913.14671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
35374 1726882913.15160: in VariableManager get_vars()
35374 1726882913.15182: done with get_vars()
35374 1726882913.15187: variable 'omit' from source: magic vars
35374 1726882913.15272: variable 'omit' from source: magic vars
35374 1726882913.15308: in VariableManager get_vars()
35374 1726882913.15381: done with get_vars()
35374 1726882913.15479: in VariableManager get_vars()
35374 1726882913.15493: done with get_vars()
35374 1726882913.15588: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
35374 1726882913.16059: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
35374 1726882913.16381: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
35374 1726882913.18158: in VariableManager get_vars()
35374 1726882913.18205: done with get_vars()
35374 1726882913.19994: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
35374 1726882913.20241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
35374 1726882913.23826: in VariableManager get_vars()
35374 1726882913.23846: done with get_vars()
35374 1726882913.23852: variable 'omit' from source: magic vars
35374 1726882913.23865: variable 'omit' from source: magic vars
35374 1726882913.24014: in VariableManager get_vars()
35374 1726882913.24043: done with get_vars()
35374 1726882913.24065: in VariableManager get_vars()
35374 1726882913.24083: done with get_vars()
35374 1726882913.24224: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
35374 1726882913.24459: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
35374 1726882913.24614: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
35374 1726882913.25498: in VariableManager get_vars()
35374 1726882913.25634: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
35374 1726882913.28487: in VariableManager get_vars()
35374 1726882913.28508: done with get_vars()
35374 1726882913.28512: variable 'omit' from source: magic vars
35374 1726882913.28523: variable 'omit' from source: magic vars
35374 1726882913.28558: in VariableManager get_vars()
35374 1726882913.28585: done with get_vars()
35374 1726882913.28606: in VariableManager get_vars()
35374 1726882913.28624: done with get_vars()
35374 1726882913.28653: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
35374 1726882913.28884: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
35374 1726882913.28966: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
35374 1726882913.31045: in VariableManager get_vars()
35374 1726882913.31082: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
35374 1726882913.33220: in VariableManager get_vars()
35374 1726882913.33267: done with get_vars()
35374 1726882913.33274: variable 'omit' from source: magic vars
35374 1726882913.33299: variable 'omit' from source: magic vars
35374 1726882913.33335: in VariableManager get_vars()
35374 1726882913.33357: done with get_vars()
35374 1726882913.33384: in VariableManager get_vars()
35374 1726882913.33409: done with get_vars()
35374 1726882913.33444: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
35374 1726882913.33611: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
35374 1726882913.33705: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
35374 1726882913.34221: in VariableManager get_vars()
35374 1726882913.34250: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
35374 1726882913.36832: in VariableManager get_vars()
35374 1726882913.36860: done with get_vars()
35374 1726882913.36904: in VariableManager get_vars()
35374 1726882913.37043: done with get_vars()
35374 1726882913.37112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
35374 1726882913.37132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
35374 1726882913.37782: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
35374 1726882913.38102: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
35374 1726882913.38104: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
35374 1726882913.38246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
35374 1726882913.38281: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
35374 1726882913.38643: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
35374 1726882913.38829: Loaded config def from plugin (callback/default)
35374 1726882913.38832: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
35374 1726882913.41505: Loaded config def from plugin (callback/junit)
35374 1726882913.41508: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
35374 1726882913.41557: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
35374 1726882913.41744: Loaded config def from plugin (callback/minimal)
35374 1726882913.41747: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
35374 1726882913.41790: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
35374 1726882913.41975: Loaded config def from plugin (callback/tree)
35374 1726882913.41978: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
35374 1726882913.42213: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
35374 1726882913.42216: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
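
Only one stdout callback can be active per run, which is why default, minimal, and oneline are skipped once ansible.posix.debug has claimed that slot. A sketch of selecting the same callbacks explicitly via the documented environment variables when launching the run (paths copied from the log):

    import os
    import subprocess

    # ANSIBLE_STDOUT_CALLBACK picks the single stdout callback;
    # ANSIBLE_CALLBACKS_ENABLED turns on additional non-stdout callbacks
    # such as the profile_tasks timer seen in the TASK banners below.
    env = dict(
        os.environ,
        ANSIBLE_STDOUT_CALLBACK="ansible.posix.debug",
        ANSIBLE_CALLBACKS_ENABLED="ansible.posix.profile_tasks",
    )
    subprocess.run(
        ["ansible-playbook", "-vvvv",
         "/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml"],
        env=env, check=False,
    )
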
PLAYBOOK: tests_wireless_nm.yml ************************************************
2 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
35374 1726882913.42356: in VariableManager get_vars()
35374 1726882913.42372: done with get_vars()
35374 1726882913.42378: in VariableManager get_vars()
35374 1726882913.43110: done with get_vars()
35374 1726882913.43116: variable 'omit' from source: magic vars
35374 1726882913.43153: in VariableManager get_vars()
35374 1726882913.43170: done with get_vars()
35374 1726882913.43190: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_wireless.yml' with nm as provider] *********
35374 1726882913.44340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
35374 1726882913.44416: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
35374 1726882913.44883: getting the remaining hosts for this loop
35374 1726882913.44885: done getting the remaining hosts for this loop
35374 1726882913.44887: getting the next task for host managed_node1
35374 1726882913.44891: done getting next task for host managed_node1
35374 1726882913.44893: ^ task is: TASK: Gathering Facts
35374 1726882913.44895: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
35374 1726882913.44897: getting variables
35374 1726882913.44898: in VariableManager get_vars()
35374 1726882913.44907: Calling all_inventory to load vars for managed_node1
35374 1726882913.44910: Calling groups_inventory to load vars for managed_node1
35374 1726882913.44912: Calling all_plugins_inventory to load vars for managed_node1
35374 1726882913.44923: Calling all_plugins_play to load vars for managed_node1
35374 1726882913.44934: Calling groups_plugins_inventory to load vars for managed_node1
35374 1726882913.44938: Calling groups_plugins_play to load vars for managed_node1
35374 1726882913.44975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
35374 1726882913.45027: done with get_vars()
35374 1726882913.45033: done getting variables
35374 1726882913.45205: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Friday 20 September 2024 21:41:53 -0400 (0:00:00.030) 0:00:00.030 ******
35374 1726882913.45226: entering _queue_task() for managed_node1/gather_facts
35374 1726882913.45227: Creating lock for gather_facts
35374 1726882913.45758: worker is 1 (out of 1 available)
35374 1726882913.45974: exiting _queue_task() for managed_node1/gather_facts
35374 1726882913.45989: done queuing things up, now waiting for results queue to drain
35374 1726882913.45991: waiting for pending results...
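
"worker is 1 (out of 1 available)" shows the linear strategy handing the queued task to a pool of worker processes; the pool size comes from the forks setting, capped by the number of hosts in the current batch (a single host here). A sketch of raising forks for a larger inventory, using the real -f/--forks flag:

    import subprocess

    # -f/--forks sets the maximum number of parallel worker processes
    # (ANSIBLE_FORKS in the environment does the same). Paths from the log.
    subprocess.run(
        ["ansible-playbook", "-f", "10",
         "-i", "/tmp/network-91m/inventory.yml",
         "/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml"],
        check=False,
    )
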
35374 1726882913.46600: running TaskExecutor() for managed_node1/TASK: Gathering Facts
35374 1726882913.46753: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000147
35374 1726882913.46785: variable 'ansible_search_path' from source: unknown
35374 1726882913.46859: calling self._execute()
35374 1726882913.46994: variable 'ansible_host' from source: host vars for 'managed_node1'
35374 1726882913.47049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
35374 1726882913.47061: variable 'omit' from source: magic vars
35374 1726882913.47277: variable 'omit' from source: magic vars
35374 1726882913.47370: variable 'omit' from source: magic vars
35374 1726882913.47410: variable 'omit' from source: magic vars
35374 1726882913.47462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
35374 1726882913.47561: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
35374 1726882913.47654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
35374 1726882913.47678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
35374 1726882913.47697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
35374 1726882913.47727: variable 'inventory_hostname' from source: host vars for 'managed_node1'
35374 1726882913.47805: variable 'ansible_host' from source: host vars for 'managed_node1'
35374 1726882913.47815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
35374 1726882913.48034: Set connection var ansible_shell_type to sh
35374 1726882913.48047: Set connection var ansible_shell_executable to /bin/sh
35374 1726882913.48057: Set connection var ansible_pipelining to False
35374 1726882913.48128: Set connection var ansible_timeout to 10
35374 1726882913.48167: Set connection var ansible_module_compression to ZIP_DEFLATED
35374 1726882913.48183: Set connection var ansible_connection to ssh
35374 1726882913.48210: variable 'ansible_shell_executable' from source: unknown
35374 1726882913.48217: variable 'ansible_connection' from source: unknown
35374 1726882913.48236: variable 'ansible_module_compression' from source: unknown
35374 1726882913.48294: variable 'ansible_shell_type' from source: unknown
35374 1726882913.48302: variable 'ansible_shell_executable' from source: unknown
35374 1726882913.48317: variable 'ansible_host' from source: host vars for 'managed_node1'
35374 1726882913.48341: variable 'ansible_pipelining' from source: unknown
35374 1726882913.48349: variable 'ansible_timeout' from source: unknown
35374 1726882913.48367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
35374 1726882913.48753: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
35374 1726882913.48841: variable 'omit' from source: magic vars
35374 1726882913.48851: starting attempt loop
35374 1726882913.48858: running the handler
35374 1726882913.48887: variable 'ansible_facts' from source: unknown
35374 1726882913.48994: _low_level_execute_command(): starting
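
Note "Set connection var ansible_pipelining to False" above: with pipelining off, every module run needs the full mkdir/transfer/chmod/execute round trip that follows. With pipelining on, the module source is fed to the remote interpreter's stdin and those temp-file steps disappear. A sketch using the documented switch:

    import os
    import subprocess

    # ANSIBLE_PIPELINING is the documented toggle for this behavior
    # (when combined with become, the remote sudoers must not set
    # 'requiretty'). Host and inventory names taken from the log.
    env = dict(os.environ, ANSIBLE_PIPELINING="1")
    subprocess.run(
        ["ansible", "managed_node1", "-i", "/tmp/network-91m/inventory.yml",
         "-m", "ping"],
        env=env, check=False,
    )
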
35374 1726882913.49008: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
35374 1726882913.51042: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
35374 1726882913.51063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
35374 1726882913.51081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
35374 1726882913.51099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
35374 1726882913.51142: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
35374 1726882913.51293: stderr chunk (state=3): >>>debug2: match not found <<<
35374 1726882913.51308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
35374 1726882913.51326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
35374 1726882913.51337: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
35374 1726882913.51348: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
35374 1726882913.51360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
35374 1726882913.51376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
35374 1726882913.51400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
35374 1726882913.51412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
35374 1726882913.51422: stderr chunk (state=3): >>>debug2: match found <<<
35374 1726882913.51434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
35374 1726882913.51520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
35374 1726882913.51544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
35374 1726882913.51558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
35374 1726882913.51734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
35374 1726882913.53406: stdout chunk (state=3): >>>/root <<<
35374 1726882913.53597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
35374 1726882913.53600: stdout chunk (state=3): >>><<<
35374 1726882913.53603: stderr chunk (state=3): >>><<<
35374 1726882913.53716: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
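
The "debug1: auto-mux: Trying existing master" and "mux_client_request_session" lines are OpenSSH connection multiplexing: a persistent ControlMaster process holds one SSH session open and each command becomes a cheap mux client request against it. A rough stand-alone equivalent with stock OpenSSH options (the host is the one in the log; the exact ControlPath ansible generates under ~/.ansible/cp differs, so this path is illustrative):

    import subprocess

    # ControlMaster=auto reuses (or creates) a master connection;
    # ControlPersist keeps it alive between commands, which is why the
    # log never renegotiates a full SSH handshake per command.
    ssh_args = [
        "ssh",
        "-o", "ControlMaster=auto",
        "-o", "ControlPersist=60s",
        "-o", "ControlPath=~/.ansible/cp/%h-%p-%r",  # illustrative path
        "root@10.31.44.90",
        "echo ~ && sleep 0",
    ]
    subprocess.run(ssh_args, check=False)
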
35374 1726882913.53720: _low_level_execute_command(): starting
35374 1726882913.53723: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517 `" && echo ansible-tmp-1726882913.5362232-35410-123533882588517="` echo /root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517 `" ) && sleep 0'
35374 1726882913.55142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config <<<
35374 1726882913.55153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
35374 1726882913.55185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
35374 1726882913.55188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config <<<
35374 1726882913.55191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found <<<
35374 1726882913.55193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
35374 1726882913.55385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
35374 1726882913.55389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
35374 1726882913.55442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
35374 1726882913.55647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
35374 1726882913.57557: stdout chunk (state=3): >>>ansible-tmp-1726882913.5362232-35410-123533882588517=/root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517 <<<
35374 1726882913.57655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
35374 1726882913.57731: stderr chunk (state=3): >>><<<
35374 1726882913.57735: stdout chunk (state=3): >>><<<
35374 1726882913.57874: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882913.5362232-35410-123533882588517=/root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
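
The remote temp directory created above is named ansible-tmp-<epoch>-<pid>-<random> and made under umask 77 so only the connecting user can read it (visible directly in the shell command). A local, illustrative Python sketch of the same naming and creation pattern:

    import os
    import random
    import time

    # Mirrors the remote command: ansible-tmp-<time.time()>-<pid>-<random>,
    # created with umask 77. Run locally here instead of over SSH.
    base = os.path.expanduser("~/.ansible/tmp")
    name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2**48))
    old_umask = os.umask(0o077)
    try:
        os.makedirs(os.path.join(base, name))
    finally:
        os.umask(old_umask)
    print(os.path.join(base, name))
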
35374 1726882913.57878: variable 'ansible_module_compression' from source: unknown
35374 1726882913.57881: ANSIBALLZ: Using generic lock for ansible.legacy.setup
35374 1726882913.57883: ANSIBALLZ: Acquiring lock
35374 1726882913.57886: ANSIBALLZ: Lock acquired: 139643193454912
35374 1726882913.57888: ANSIBALLZ: Creating module
35374 1726882914.04822: ANSIBALLZ: Writing module into payload
35374 1726882914.05016: ANSIBALLZ: Writing module
35374 1726882914.05047: ANSIBALLZ: Renaming module
35374 1726882914.05056: ANSIBALLZ: Done creating module
35374 1726882914.05109: variable 'ansible_facts' from source: unknown
35374 1726882914.05121: variable 'inventory_hostname' from source: host vars for 'managed_node1'
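
The ANSIBALLZ steps above package the setup module: the module and its module_utils dependencies are zipped (ZIP_DEFLATED, per the connection var earlier), the zip is base64-embedded in a small bootstrap script, and only that one file has to be shipped and executed remotely. A much-simplified illustration of the idea; the real wrapper also handles module parameters, interpreter selection, and cleanup, and the names here are stand-ins:

    import base64
    import io
    import zipfile

    module_source = b"print('hello from the module')"  # stand-in for setup.py

    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("ansible/modules/demo.py", module_source)

    payload = base64.b64encode(buf.getvalue()).decode()
    wrapper = (
        "import base64, io, zipfile\n"
        f"zipdata = base64.b64decode({payload!r})\n"
        "with zipfile.ZipFile(io.BytesIO(zipdata)) as zf:\n"
        "    exec(zf.read('ansible/modules/demo.py'))\n"
    )
    # The single self-contained file that would get transferred and executed.
    with open("AnsiballZ_demo.py", "w") as f:
        f.write(wrapper)
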
35374 1726882914.05134: _low_level_execute_command(): starting
35374 1726882914.05145: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
35374 1726882914.05895: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
35374 1726882914.05909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
35374 1726882914.05974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
35374 1726882914.06005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
35374 1726882914.06073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
35374 1726882914.06091: stderr chunk (state=3): >>>debug2: match not found <<<
35374 1726882914.06151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
35374 1726882914.06154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration <<<
35374 1726882914.06156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found <<<
35374 1726882914.06158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
35374 1726882914.06222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK <<<
35374 1726882914.06227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
35374 1726882914.06340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
35374 1726882914.09004: stdout chunk (state=3): >>>PLATFORM
Linux
FOUND
/usr/bin/python3.9
/usr/bin/python3
/usr/bin/python3
ENDFOUND <<<
35374 1726882914.09034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
35374 1726882914.09549: stderr chunk (state=3): >>><<<
35374 1726882914.09552: stdout chunk (state=3): >>><<<
35374 1726882914.09555: _low_level_execute_command() done: rc=0, stdout=PLATFORM
Linux
FOUND
/usr/bin/python3.9
/usr/bin/python3
/usr/bin/python3
ENDFOUND
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 4
debug2: Received exit status from master 0
35374 1726882914.09560 [managed_node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3']
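
The PLATFORM/FOUND/ENDFOUND markers fence the probe output so the controller can parse which interpreters exist on the target; discovery then takes the first hit, /usr/bin/python3.9 here. A local sketch of the same probe-and-pick logic (the real probe runs `command -v` remotely; shutil.which is the local equivalent):

    import shutil

    # Same candidate order as the remote command above.
    CANDIDATES = ["python3.12", "python3.11", "python3.10", "python3.9",
                  "python3.8", "python3.7", "/usr/bin/python3", "python3"]

    found = [path for name in CANDIDATES if (path := shutil.which(name))]
    print("found interpreters:", found)
    interpreter = found[0] if found else None  # discovery fails if none match
    print("selected:", interpreter)
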
9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 35374 1726882914.16524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 35374 1726882914.16573: stderr chunk (state=3): >>><<< 35374 1726882914.16578: stdout chunk (state=3): >>><<< 35374 1726882914.16592: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 35374 1726882914.16646: variable 'ansible_facts' from source: unknown 35374 1726882914.16649: variable 'ansible_facts' from source: unknown 35374 1726882914.16657: variable 'ansible_module_compression' from source: unknown 35374 1726882914.16696: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-35374mvgt63ho/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 35374 1726882914.16721: variable 'ansible_facts' from source: unknown 35374 1726882914.16831: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517/AnsiballZ_setup.py 35374 1726882914.16948: Sending initial data 35374 1726882914.16952: Sent initial data (154 bytes) 35374 1726882914.17614: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 35374 1726882914.17620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 35374 1726882914.17663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882914.17668: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
35374 1726882914.16696: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-35374mvgt63ho/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
35374 1726882914.16721: variable 'ansible_facts' from source: unknown
35374 1726882914.16831: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517/AnsiballZ_setup.py
35374 1726882914.16948: Sending initial data
35374 1726882914.16952: Sent initial data (154 bytes)
35374 1726882914.17614: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config <<<
35374 1726882914.17620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
35374 1726882914.17663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
35374 1726882914.17668: stderr chunk (state=3): >>>debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config <<<
35374 1726882914.17673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
35374 1726882914.17734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
35374 1726882914.17737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
35374 1726882914.17741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
35374 1726882914.17838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<<
35374 1726882914.20324: stderr chunk (state=3): >>>debug2: Remote version: 3
debug2: Server supports extension "posix-rename@openssh.com" revision 1
debug2: Server supports extension "statvfs@openssh.com" revision 2
debug2: Server supports extension "fstatvfs@openssh.com" revision 2
debug2: Server supports extension "hardlink@openssh.com" revision 1
debug2: Server supports extension "fsync@openssh.com" revision 1
debug2: Server supports extension "lsetstat@openssh.com" revision 1
debug2: Server supports extension "limits@openssh.com" revision 1
debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
35374 1726882914.20420: stderr chunk (state=3): >>>debug1: Using server download size 261120
debug1: Using server upload size 261120
debug1: Server handle limit 1019; using 64 <<<
35374 1726882914.20518: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-35374mvgt63ho/tmpkgltp9u6 /root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517/AnsiballZ_setup.py <<<
35374 1726882914.20612: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
35374 1726882914.22610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
35374 1726882914.22714: stderr chunk (state=3): >>><<<
35374 1726882914.22717: stdout chunk (state=3): >>><<<
35374 1726882914.22735: done transferring module to remote
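
The transfer above is a plain sftp "put" (note the "sftp> put ..." stdout line) run over the already-multiplexed connection. A stand-alone equivalent feeding the batch command on stdin, with the source and destination paths copied from the log:

    import subprocess

    # sftp -b - reads batch commands from stdin; ansible's default
    # transfer method does essentially this over the master connection.
    subprocess.run(
        ["sftp", "-b", "-", "root@10.31.44.90"],
        input="put /root/.ansible/tmp/ansible-local-35374mvgt63ho/tmpkgltp9u6 "
              "/root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517/AnsiballZ_setup.py\n",
        text=True, check=False,
    )
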
35374 1726882914.22748: _low_level_execute_command(): starting
35374 1726882914.22751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517/ /root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517/AnsiballZ_setup.py && sleep 0'
35374 1726882914.23202: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config <<<
35374 1726882914.23207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
35374 1726882914.23239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
35374 1726882914.23250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
35374 1726882914.23302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
35374 1726882914.23314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4 <<<
35374 1726882914.23423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<<
35374 1726882914.25853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
35374 1726882914.25900: stderr chunk (state=3): >>><<<
35374 1726882914.25904: stdout chunk (state=3): >>><<<
35374 1726882914.25917: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 3
debug2: Received exit status from master 0
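
With the temp directory and module made executable, the next command runs the transferred AnsiballZ_setup.py under PYTHONVERBOSE=1, which is why the remote interpreter's entire import trace (the "import ..." and "# code object from ..." lines) streams back below. A local sketch of the same invocation:

    import os
    import subprocess

    # PYTHONVERBOSE=1 makes CPython report every import, exactly as in the
    # remote command that follows; interpreter and module paths from the log.
    env = dict(os.environ, PYTHONVERBOSE="1")
    subprocess.run(
        ["/usr/bin/python3.9",
         "/root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517/AnsiballZ_setup.py"],
        env=env, check=False,
    )
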
35374 1726882914.25920: _low_level_execute_command(): starting
35374 1726882914.25923: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517/AnsiballZ_setup.py && sleep 0'
35374 1726882914.26357: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config <<<
35374 1726882914.26363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
35374 1726882914.26395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
35374 1726882914.26406: stderr chunk (state=3): >>>debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config <<<
35374 1726882914.26417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
35374 1726882914.26474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK <<<
35374 1726882914.26479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
35374 1726882914.26593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<<
35374 1726882914.29448: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<<
35374 1726882914.29454: stdout chunk (state=3): >>>import _imp # builtin <<<
35374 1726882914.29477: stdout chunk (state=3): >>>import '_thread' #
import '_warnings' # <<<
35374 1726882914.29481: stdout chunk (state=3): >>>import '_weakref' # <<<
35374 1726882914.29563: stdout chunk (state=3): >>>import '_io' # <<<
35374 1726882914.29569: stdout chunk (state=3): >>>import 'marshal' # <<<
35374 1726882914.29616: stdout chunk (state=3): >>>import 'posix' # <<<
35374 1726882914.29651: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<<
35374 1726882914.29655: stdout chunk (state=3): >>># installing zipimport hook <<<
35374 1726882914.29697: stdout chunk (state=3): >>>import 'time' # <<<
35374 1726882914.29711: stdout chunk (state=3): >>>import 'zipimport' #
# installed zipimport hook <<<
35374 1726882914.29776: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py
# code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<<
35374 1726882914.29800: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<<
35374 1726882914.29818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<<
35374 1726882914.29836: stdout chunk (state=3): >>>import '_codecs' # <<<
35374 1726882914.29861: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08173dc0> <<<
35374 1726882914.29904: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<<
35374 1726882914.29923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<<
35374 1726882914.29927: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d081183a0> <<<
35374 1726882914.29931: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08173b20> <<<
35374 1726882914.29963: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py
# code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<<
35374 1726882914.29986: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08173ac0> <<<
35374 1726882914.30005: stdout chunk (state=3): >>>import '_signal' # <<<
35374 1726882914.30033: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<<
35374 1726882914.30036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<<
35374 1726882914.30053: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08118490> <<<
35374 1726882914.30079: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<<
35374 1726882914.30090: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<<
35374 1726882914.30102: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<<
35374 1726882914.30111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<<
35374 1726882914.30119: stdout chunk (state=3): >>>import '_abc' # <<<
35374 1726882914.30130: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08118940> <<<
35374 1726882914.30159: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08118670> <<<
35374 1726882914.30186: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<<
35374 1726882914.30203: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<<
35374 1726882914.30225: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<<
35374 1726882914.30252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc'<<<
35374 1726882914.30256: stdout chunk (state=3): >>> <<<
35374 1726882914.30279: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<<
35374 1726882914.30304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<<
35374 1726882914.30322: stdout chunk (state=3): >>>import '_stat' # <<<
35374 1726882914.30329: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d080cf190> <<<
35374 1726882914.30347: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<<
35374 1726882914.30381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<<
35374 1726882914.30632: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d080cf220>
# /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py
# code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc'
# /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py
# code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc'
import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d080f2850>
import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d080cf940>
import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08130880>
# /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py
# code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<<
35374 1726882914.30641: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d080c8d90> <<<
35374 1726882914.30704: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py <<<
1726882914.30720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 35374 1726882914.30736: stdout chunk (state=3): >>>import '_locale' # <<< 35374 1726882914.30741: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d080f2d90> <<< 35374 1726882914.30848: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08118970> <<< 35374 1726882914.30893: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 35374 1726882914.31411: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 35374 1726882914.31431: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 35374 1726882914.31467: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 35374 1726882914.31487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 35374 1726882914.31515: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 35374 1726882914.31543: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 35374 1726882914.31574: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 35374 1726882914.31595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 35374 1726882914.31610: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0806eeb0> <<< 35374 1726882914.31691: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08071f40> <<< 35374 1726882914.31711: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 35374 1726882914.31731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 35374 1726882914.31753: stdout chunk (state=3): >>>import '_sre' # <<< 35374 1726882914.31786: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 35374 1726882914.31810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 35374 1726882914.31837: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 35374 1726882914.31849: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 35374 1726882914.31887: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08067610> <<< 35374 1726882914.31915: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0806d640> <<< 35374 1726882914.31936: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0806e370> <<< 35374 1726882914.31967: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 35374 1726882914.32067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 35374 1726882914.32091: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 35374 1726882914.32138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 35374 1726882914.32162: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 35374 1726882914.32182: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 35374 1726882914.32221: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.32235: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07d4ddc0> <<< 35374 1726882914.32242: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d4d8b0> <<< 35374 1726882914.32264: stdout chunk (state=3): >>>import 'itertools' # <<< 35374 1726882914.32291: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 35374 1726882914.32297: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' <<< 35374 1726882914.32312: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d4deb0> <<< 35374 1726882914.32337: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 35374 1726882914.32372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 35374 1726882914.32396: stdout chunk (state=3): >>>import '_operator' # <<< 35374 1726882914.32409: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d4df70> <<< 35374 1726882914.32428: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 35374 1726882914.32453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 35374 1726882914.32462: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d4de80> <<< 35374 1726882914.32485: stdout chunk (state=3): >>>import '_collections' # <<< 35374 1726882914.32549: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08049d30> <<< 35374 1726882914.32594: stdout chunk (state=3): >>>import '_functools' # <<< 35374 1726882914.32629: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08042610> <<< 35374 1726882914.32712: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' 
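The `import 'name' # <loader>` and `# code object from '...cpython-39.pyc'` lines inside the stdout chunks above are CPython's verbose import tracing: each one records which loader satisfied an import and whether a cached bytecode file was reused, which suggests the remote Python 3.9 interpreter was started in verbose mode (`python -v`, or with `PYTHONVERBOSE` set in its environment). A minimal local reproduction of that trace format, assuming only a stock Python 3 interpreter (the `-c "import json"` sample program is an arbitrary stand-in, not taken from this run), looks like:

    # Run a child interpreter with -v and filter its stderr, which is where
    # CPython writes its import provenance messages.
    import subprocess
    import sys

    proc = subprocess.run(
        [sys.executable, "-v", "-c", "import json"],  # arbitrary sample program
        capture_output=True,
        text=True,
    )
    for line in proc.stderr.splitlines():
        # Keep only the two message shapes seen throughout this log.
        if line.startswith(("import ", "# code object from ")):
            print(line)
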
import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08055670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08075e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 35374 1726882914.32728: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07d5fc70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08049250> <<< 35374 1726882914.32823: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d08055280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0807b9d0> <<< 35374 1726882914.32846: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 35374 1726882914.32879: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 35374 1726882914.32922: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d5ffa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d5fd90> <<< 35374 1726882914.32953: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d5fd00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 35374 1726882914.32994: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 35374 1726882914.33004: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 35374 1726882914.33041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 35374 1726882914.33092: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 
'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d32370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 35374 1726882914.33111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 35374 1726882914.33133: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d32460> <<< 35374 1726882914.33260: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d66fa0> <<< 35374 1726882914.33301: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d61a30> <<< 35374 1726882914.33326: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d61490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 35374 1726882914.33362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 35374 1726882914.33399: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 35374 1726882914.33415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c5b1c0> <<< 35374 1726882914.33446: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d1dc70> <<< 35374 1726882914.33496: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d61eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0807b040> <<< 35374 1726882914.33513: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 35374 1726882914.33575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c6daf0> <<< 35374 1726882914.33614: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c6de20> <<< 35374 1726882914.33642: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 35374 1726882914.33679: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c7f730> <<< 35374 1726882914.33692: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 35374 1726882914.33716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 35374 1726882914.33740: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c7fc70> <<< 35374 1726882914.33808: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c0c3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c6df10> <<< 35374 1726882914.33819: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 35374 1726882914.33867: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c1d280> <<< 35374 1726882914.33902: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c7f5b0> import 'pwd' # <<< 35374 1726882914.33925: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c1d340> <<< 35374 1726882914.33941: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d5f9d0> <<< 35374 1726882914.33985: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 35374 1726882914.34008: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 35374 1726882914.34055: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c386a0> <<< 35374 1726882914.34110: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c38970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c38760> <<< 35374 1726882914.34146: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c38850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 35374 1726882914.34352: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.34372: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c38ca0> <<< 35374 1726882914.34380: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.34732: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c451f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c388e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c2ca30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d5f5b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c38a90> <<< 35374 1726882914.34856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9d07b5a670> <<< 35374 1726882914.35182: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 35374 1726882914.35336: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.35384: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py<<< 35374 1726882914.35390: stdout chunk (state=3): >>> <<< 35374 1726882914.35418: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.35444: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.35476: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 35374 1726882914.35508: stdout chunk (state=3): >>># zipimport: zlib available<<< 35374 1726882914.35515: stdout chunk (state=3): >>> <<< 35374 1726882914.37288: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.38266: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074bb7f0> <<< 35374 1726882914.38299: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 35374 1726882914.38329: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 35374 1726882914.38366: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 35374 1726882914.38371: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.38374: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d0754c760> <<< 35374 1726882914.38404: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0754c640> <<< 35374 1726882914.38441: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0754c370> <<< 35374 1726882914.38462: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 35374 1726882914.38507: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0754c490> <<< 35374 1726882914.38526: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0754c190> <<< 35374 1726882914.38878: stdout chunk (state=3): >>>import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d0754c400> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0754c7c0> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9d075257c0> <<< 35374 1726882914.38914: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07525b50> <<< 35374 1726882914.38944: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.38952: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d075259a0> <<< 35374 1726882914.38966: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 35374 1726882914.38980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 35374 1726882914.39021: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0743e4f0> <<< 35374 1726882914.39043: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07545d30> <<< 35374 1726882914.39569: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0754c520> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07545190> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07576a90> <<< 35374 1726882914.39594: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07519190> <<< 35374 1726882914.39599: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07519790> <<< 35374 1726882914.39616: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07444d00> <<< 35374 1726882914.39648: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.39679: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.39684: stdout chunk (state=3): >>>import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d075196a0> <<< 35374 1726882914.39730: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py <<< 35374 1726882914.39737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' <<< 35374 1726882914.39765: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0759ad30> <<< 35374 1726882914.39800: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 35374 1726882914.39827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 35374 1726882914.39862: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py<<< 35374 1726882914.39867: stdout chunk (state=3): >>> <<< 35374 1726882914.39926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 35374 1726882914.40030: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.40034: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.40050: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d0749c9a0> <<< 35374 1726882914.40056: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d075a5e50> <<< 35374 1726882914.40087: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 35374 1726882914.40120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 35374 1726882914.40212: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.40216: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.40223: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d074ac0d0> <<< 35374 1726882914.40238: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d075a5e20> <<< 35374 1726882914.40279: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py<<< 35374 1726882914.40282: stdout chunk (state=3): >>> <<< 35374 1726882914.40343: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 35374 1726882914.40386: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py<<< 35374 1726882914.40403: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc'<<< 35374 1726882914.40406: stdout chunk (state=3): >>> <<< 35374 1726882914.40410: stdout 
chunk (state=3): >>>import '_string' # <<< 35374 1726882914.40517: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d075ac220><<< 35374 1726882914.40520: stdout chunk (state=3): >>> <<< 35374 1726882914.40739: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074ac100><<< 35374 1726882914.40743: stdout chunk (state=3): >>> <<< 35374 1726882914.40902: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.40906: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07570b80><<< 35374 1726882914.40909: stdout chunk (state=3): >>> <<< 35374 1726882914.40987: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.40990: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d075a5ac0> <<< 35374 1726882914.41086: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.41131: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d075a5d00> <<< 35374 1726882914.41134: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07b5a820> <<< 35374 1726882914.41137: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py<<< 35374 1726882914.41169: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 35374 1726882914.41207: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 35374 1726882914.41217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 35374 1726882914.41292: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.41295: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882914.41298: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d074a80d0> <<< 35374 1726882914.41531: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f9d0749e370> <<< 35374 1726882914.41536: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074a8d00> <<< 35374 1726882914.41591: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d074a86a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074a9130> <<< 35374 1726882914.41594: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.41614: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 35374 1726882914.41689: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.41797: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 35374 1726882914.41824: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 35374 1726882914.41830: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.41921: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.42018: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.42469: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.42935: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 35374 1726882914.42939: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 35374 1726882914.42976: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 35374 1726882914.42984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 35374 1726882914.43027: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d074e48b0> <<< 35374 1726882914.43103: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 35374 1726882914.43116: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074e9910> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d070406a0> <<< 35374 1726882914.43156: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 35374 1726882914.43186: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.43212: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.43215: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 35374 1726882914.43334: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.43455: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 35374 1726882914.43491: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d075237f0> <<< 35374 1726882914.43494: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.43881: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.44237: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.44296: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.44362: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 35374 1726882914.44368: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.44395: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.44447: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 35374 1726882914.44453: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.44491: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.44605: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 35374 1726882914.44610: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.44646: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.44693: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 35374 1726882914.45054: stdout 
chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 35374 1726882914.45081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 35374 1726882914.45165: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07045d90> # zipimport: zlib available <<< 35374 1726882914.45220: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.45309: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 35374 1726882914.45313: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 35374 1726882914.45324: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.45361: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.45401: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 35374 1726882914.45434: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.45471: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.45804: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d074d70a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07027430> <<< 35374 1726882914.45895: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available <<< 35374 1726882914.45944: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.45972: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46016: stdout chunk (state=3): >>># 
/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 35374 1726882914.46037: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 35374 1726882914.46074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 35374 1726882914.46093: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 35374 1726882914.46114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 35374 1726882914.46190: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074e0160> <<< 35374 1726882914.46237: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074ddcd0> <<< 35374 1726882914.46287: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07045bb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 35374 1726882914.46306: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46319: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46350: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 35374 1726882914.46419: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 35374 1726882914.46450: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 35374 1726882914.46467: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46507: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46577: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46596: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46606: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46634: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46686: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46708: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46737: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available <<< 35374 1726882914.46815: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 35374 1726882914.46875: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46908: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.46926: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 35374 1726882914.47074: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.47218: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.47241: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.47294: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 35374 1726882914.47342: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 35374 1726882914.47355: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 35374 1726882914.47381: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06dc0a60> <<< 35374 1726882914.47407: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 35374 1726882914.47428: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 35374 1726882914.47469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 35374 1726882914.47496: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 35374 1726882914.47507: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d070206d0> <<< 35374 1726882914.47550: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07020af0> <<< 35374 1726882914.47601: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07006250> <<< 35374 1726882914.47621: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07006a30> <<< 35374 1726882914.47648: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07055460> <<< 35374 1726882914.47682: stdout 
chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07055910> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 35374 1726882914.47707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 35374 1726882914.47723: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 35374 1726882914.47754: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07052d00> <<< 35374 1726882914.47788: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07052d60> <<< 35374 1726882914.47800: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 35374 1726882914.47825: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07052250> <<< 35374 1726882914.47838: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 35374 1726882914.47861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 35374 1726882914.47894: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d06e28f70> <<< 35374 1726882914.47913: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0706a4c0> <<< 35374 1726882914.47948: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07055310> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 35374 1726882914.47979: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available <<< 35374 1726882914.48006: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available <<< 35374 1726882914.48053: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.48102: stdout chunk (state=3): 
>>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 35374 1726882914.48120: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.48149: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.48185: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 35374 1726882914.48227: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 35374 1726882914.48238: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 35374 1726882914.48259: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.48288: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 35374 1726882914.48300: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.48343: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.48383: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 35374 1726882914.48421: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.48465: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 35374 1726882914.48519: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.48576: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.48613: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.48678: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 35374 1726882914.48689: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.49065: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.49429: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 35374 1726882914.49470: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.49519: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.49552: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 35374 1726882914.49582: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 35374 1726882914.49594: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.49612: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.49638: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 35374 1726882914.49666: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.49700: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.49747: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 35374 1726882914.49775: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.49802: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.49815: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 35374 1726882914.49843: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.49874: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 35374 1726882914.49893: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.49939: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.50019: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 35374 1726882914.50040: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06d42ca0> <<< 35374 1726882914.50058: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 35374 1726882914.50091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 35374 1726882914.50251: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06d42fd0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 35374 1726882914.50254: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.50304: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.50367: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 35374 1726882914.50444: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.50529: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 35374 1726882914.50533: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.50582: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.50665: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 35374 1726882914.50690: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.50736: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 35374 1726882914.50755: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 35374 1726882914.50910: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d06d34370> <<< 35374 1726882914.51158: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06d83bb0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 35374 1726882914.51161: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.51209: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.51268: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 35374 1726882914.51272: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.51333: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.51405: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.51495: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.51641: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 35374 1726882914.51645: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.51679: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.51716: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available <<< 35374 1726882914.51756: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.51802: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 35374 1726882914.51855: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d06cbb160> <<< 35374 1726882914.51893: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06cbb2b0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 35374 1726882914.51912: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 35374 1726882914.51917: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.51938: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.51985: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 35374 1726882914.52113: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.52251: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 35374 1726882914.52254: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.52326: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.52408: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.52441: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.52495: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 35374 1726882914.52498: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 35374 1726882914.52581: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.52597: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.52707: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.52837: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 35374 1726882914.52841: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.52941: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.53051: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 35374 1726882914.53054: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.53077: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.53112: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.53547: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.53968: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 35374 1726882914.53971: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.54052: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.54148: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 35374 1726882914.54151: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.54227: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.54325: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 35374 1726882914.54328: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.54441: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.54584: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 35374 1726882914.54613: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 35374 1726882914.54651: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.54699: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 35374 1726882914.54703: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.54777: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.54856: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.55025: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.55198: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 35374 1726882914.55210: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.55236: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.55278: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 35374 1726882914.55295: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.55326: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 35374 1726882914.55392: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.55456: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 35374 1726882914.55467: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.55497: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.55510: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 35374 1726882914.55557: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.55622: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 35374 1726882914.55625: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.55661: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.55727: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 35374 1726882914.55730: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.55935: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56159: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 35374 1726882914.56167: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56203: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56265: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 35374 1726882914.56269: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56307: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56332: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 35374 1726882914.56335: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56357: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56415: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 35374 1726882914.56419: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56432: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56458: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 35374 1726882914.56472: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56534: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56617: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 35374 1726882914.56645: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 35374 1726882914.56681: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56728: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 35374 1726882914.56767: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 35374 1726882914.56811: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56860: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56910: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.56996: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 35374 1726882914.56999: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 35374 1726882914.57029: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.57092: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 35374 1726882914.57246: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.57434: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 35374 1726882914.57437: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.57458: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.57504: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 35374 1726882914.57550: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.57596: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 35374 1726882914.57599: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.57666: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.57747: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 35374 1726882914.57751: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.57817: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.57898: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 35374 1726882914.57986: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882914.58998: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 35374 1726882914.59034: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 35374 1726882914.59038: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 35374 1726882914.59088: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d06d0e070> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06d0e3a0> <<< 35374 1726882914.59176: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06b249d0> <<< 35374 1726882914.61307: stdout chunk (state=3): >>>import 'gc' # <<< 35374 1726882914.63526: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06d0e040> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06cbbeb0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 35374 1726882914.63598: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06b1c310> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06cb6700> <<< 35374 1726882914.63985: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 35374 1726882914.64032: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 35374 1726882914.88886: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 
9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root<<< 35374 1726882914.88930: stdout chunk (state=3): >>>/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2767, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 765, "free": 2767}, "nocache": {"free": 3232, "used": 300}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA"<<< 35374 1726882914.88949: stdout chunk (state=3): >>>, "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1072, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264233795584, "block_size": 4096, "block_total": 65519355, "block_available": 64510204, "block_used": 1009151, "inode_total": 131071472, "inode_available": 130998694, "inode_used": 72778, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "41", "second": "54", "epoch": "1726882914", "epoch_int": "1726882914", "date": "2024-09-20", "time": "21:41:54", "iso8601_micro": "2024-09-21T01:41:54.838438Z", "iso8601": "2024-09-21T01:41:54Z", "iso8601_basic": "20240920T214154838438", "iso8601_basic_short": "20240920T214154", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.64, "5m": 0.56, "15m": 0.34}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipx<<< 35374 1726882914.88968: stdout chunk (state=3): 
>>>ip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 35374 1726882914.89476: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks <<< 35374 1726882914.89482: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings <<< 35374 1726882914.89491: stdout chunk (state=3): >>># cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1<<< 35374 1726882914.89803: stdout chunk (state=3): >>> # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 <<< 35374 1726882914.89818: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # 
cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 <<< 35374 1726882914.89835: stdout chunk (state=3): >>># cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect<<< 35374 1726882914.89860: stdout chunk (state=3): >>> # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile<<< 35374 1726882914.89918: stdout chunk (state=3): >>> # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner <<< 35374 1726882914.89922: stdout chunk (state=3): >>># cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform<<< 35374 1726882914.89924: stdout chunk (state=3): >>> # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 35374 1726882914.89927: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket <<< 35374 1726882914.89929: stdout chunk (state=3): >>># cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six<<< 35374 1726882914.89931: stdout chunk (state=3): >>> # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes <<< 35374 1726882914.89933: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections<<< 35374 1726882914.89936: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters <<< 35374 1726882914.89938: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.validation <<< 35374 1726882914.89940: stdout chunk (state=3): >>># destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4<<< 35374 1726882914.89942: stdout chunk (state=3): >>> # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils <<< 35374 1726882914.89947: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle <<< 35374 1726882914.89949: stdout chunk (state=3): >>># cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ <<< 35374 1726882914.89970: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing 
ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai <<< 35374 1726882914.89973: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils<<< 35374 1726882914.89975: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time <<< 35374 1726882914.89977: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl <<< 35374 1726882914.89979: stdout chunk (state=3): >>># destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware <<< 35374 1726882914.89981: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network <<< 35374 1726882914.89983: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux <<< 35374 1726882914.89985: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy 
ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 35374 1726882914.90213: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 35374 1726882914.90256: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 35374 1726882914.90305: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 35374 1726882914.90335: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 35374 1726882914.90351: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 35374 1726882914.90366: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 35374 1726882914.90403: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 35374 1726882914.90474: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 35374 1726882914.90489: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 35374 1726882914.90520: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime <<< 35374 1726882914.90534: stdout chunk (state=3): >>># destroy base64 <<< 35374 1726882914.90553: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 35374 1726882914.90617: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 35374 1726882914.90680: stdout chunk (state=3): >>># destroy glob # destroy ansible.module_utils.compat.typing # destroy 
ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 35374 1726882914.90761: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 35374 1726882914.90826: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator <<< 35374 1726882914.90841: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal<<< 35374 1726882914.91352: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 35374 1726882914.91355: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping 
_frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks <<< 35374 1726882914.91624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 35374 1726882914.91682: stderr chunk (state=3): >>><<< 35374 1726882914.91685: stdout chunk (state=3): >>><<< 35374 1726882914.91813: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08173dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d081183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08173b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08173ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08118490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from 
'/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08118940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08118670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d080cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d080cf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d080f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d080cf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08130880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d080c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d080f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08118970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0806eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08071f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08067610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0806d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0806e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07d4ddc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d4d8b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d4deb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d4df70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d4de80> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08049d30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08042610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08055670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08075e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07d5fc70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d08049250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d08055280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0807b9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d5ffa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d5fd90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d5fd00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d32370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d32460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d66fa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d61a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d61490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c5b1c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d1dc70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d61eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0807b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c6daf0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c6de20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c7f730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c7fc70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c0c3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c6df10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c1d280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c7f5b0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c1d340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d5f9d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c386a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c38970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c38760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c38850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c38ca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07c451f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c388e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c2ca30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07d5f5b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07c38a90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9d07b5a670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074bb7f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d0754c760> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0754c640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0754c370> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0754c490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0754c190> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d0754c400> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0754c7c0> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d075257c0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07525b50> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d075259a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0743e4f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07545d30> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0754c520> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07545190> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07576a90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07519190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07519790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07444d00> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d075196a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0759ad30> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d0749c9a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d075a5e50> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d074ac0d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d075a5e20> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d075ac220> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074ac100> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07570b80> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d075a5ac0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d075a5d00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07b5a820> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # 
extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d074a80d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d0749e370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074a8d00> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d074a86a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074a9130> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d074e48b0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074e9910> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d070406a0> import ansible.module_utils.compat.selinux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d075237f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07045d90> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d074d70a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07027430> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074e0160> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d074ddcd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07045bb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06dc0a60> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d070206d0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07020af0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07006250> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07006a30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07055460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07055910> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d07052d00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07052d60> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07052250> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d06e28f70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d0706a4c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d07055310> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06d42ca0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06d42fd0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d06d34370> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06d83bb0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d06cbb160> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06cbb2b0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_x7ccpuly/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from 
'/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9d06d0e070> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06d0e3a0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06b249d0> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06d0e040> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06cbbeb0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06b1c310> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9d06cb6700> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", 
"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2767, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 765, "free": 2767}, "nocache": {"free": 3232, "used": 300}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1072, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264233795584, "block_size": 4096, "block_total": 65519355, "block_available": 64510204, "block_used": 1009151, "inode_total": 131071472, "inode_available": 130998694, "inode_used": 72778, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "41", "second": "54", "epoch": "1726882914", "epoch_int": "1726882914", "date": "2024-09-20", "time": "21:41:54", "iso8601_micro": "2024-09-21T01:41:54.838438Z", "iso8601": "2024-09-21T01:41:54Z", "iso8601_basic": "20240920T214154838438", "iso8601_basic_short": "20240920T214154", "tz": "EDT", "tz_dst": 
"EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.64, "5m": 0.56, "15m": 0.34}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", 
"tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] 
removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # 
cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # 
cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd 
# destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping 
_pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy 
ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. [WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator 
cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # 
cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy 
ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math 
# cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
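The two warnings above are related: an Ansible module reports its result as a single JSON document on stdout, and here the target's Python wrote its interpreter-shutdown chatter after that document, so the controller had to strip the trailing noise before parsing the result. A minimal sketch of the recovery idea in Python (not ansible-core's actual parsing code):

    import json

    def split_module_output(stdout: str):
        """Separate a module's leading JSON result from trailing junk."""
        text = stdout.lstrip()
        # raw_decode() parses exactly one JSON value from the start of the
        # string and reports the offset where it stopped; everything past
        # that offset is the "junk" the warning complains about.
        result, end = json.JSONDecoder().raw_decode(text)
        return result, text[end:].strip()

    result, junk = split_module_output('{"changed": false} # clear sys.path # destroy io')
    assert result == {"changed": False}
    assert junk.startswith("# clear")

The second warning is purely informational: interpreter discovery settled on /usr/bin/python3.9 on managed_node1, and pinning ansible_python_interpreter in the inventory would make that choice explicit and silence the warning.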
35374 1726882914.92703: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 35374 1726882914.92706: _low_level_execute_command(): starting 35374 1726882914.92708: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882913.5362232-35410-123533882588517/ > /dev/null 2>&1 && sleep 0' 35374 1726882914.92827: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 35374 1726882914.92830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 35374 1726882914.92833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 35374 1726882914.92873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882914.92876: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 35374 1726882914.92878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882914.92934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 35374 1726882914.92937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 35374 1726882914.93065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 35374 1726882914.94852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 35374 1726882914.94855: stderr chunk (state=3): >>><<< 35374 1726882914.94868: stdout chunk (state=3): >>><<< 35374 1726882914.95824: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 35374 1726882914.95828: handler run complete 35374 1726882914.95830: variable 'ansible_facts' from source: unknown 35374 1726882914.95833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882914.96074: variable 'ansible_facts' from source: unknown 35374 1726882914.96569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882914.96572: attempt loop complete, returning result 35374 1726882914.96575: _execute() done 35374 1726882914.96577: dumping result to json 35374 1726882914.96579: done dumping result, returning 35374 1726882914.96581: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-ee6a-9b8c-000000000147] 35374 1726882914.96583: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000147 ok: [managed_node1] 35374 1726882914.97014: no more pending results, returning what we have 35374 1726882914.97019: results queue empty 35374 1726882914.97019: checking for any_errors_fatal 35374 1726882914.97021: done checking for any_errors_fatal 35374 1726882914.97022: checking for max_fail_percentage 35374 1726882914.97024: done checking for max_fail_percentage 35374 1726882914.97024: checking to see if all hosts have failed and the running result is not ok 35374 1726882914.97025: done checking to see if all hosts have failed 35374 1726882914.97026: getting the remaining hosts for this loop 35374 1726882914.97028: done getting the remaining hosts for this loop 35374 1726882914.97032: getting the next task for host managed_node1 35374 1726882914.97040: done getting next task for host managed_node1 35374 1726882914.97042: ^ task is: TASK: meta (flush_handlers) 35374 1726882914.97044: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882914.97049: getting variables 35374 1726882914.97051: in VariableManager get_vars() 35374 1726882914.97081: Calling all_inventory to load vars for managed_node1 35374 1726882914.97084: Calling groups_inventory to load vars for managed_node1 35374 1726882914.97087: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882914.97098: Calling all_plugins_play to load vars for managed_node1 35374 1726882914.97101: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882914.97103: Calling groups_plugins_play to load vars for managed_node1 35374 1726882914.97288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882914.97488: done with get_vars() 35374 1726882914.97498: done getting variables 35374 1726882914.97530: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000147 35374 1726882914.97534: WORKER PROCESS EXITING 35374 1726882914.97581: in VariableManager get_vars() 35374 1726882914.97590: Calling all_inventory to load vars for managed_node1 35374 1726882914.97593: Calling groups_inventory to load vars for managed_node1 35374 1726882914.97595: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882914.97600: Calling all_plugins_play to load vars for managed_node1 35374 1726882914.97602: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882914.97605: Calling groups_plugins_play to load vars for managed_node1 35374 1726882914.97735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882914.98123: done with get_vars() 35374 1726882914.98137: done queuing things up, now waiting for results queue to drain 35374 1726882914.98140: results queue empty 35374 1726882914.98141: checking for any_errors_fatal 35374 1726882914.98143: done checking for any_errors_fatal 35374 1726882914.98144: checking for max_fail_percentage 35374 1726882914.98145: done checking for max_fail_percentage 35374 1726882914.98153: checking to see if all hosts have failed and the running result is not ok 35374 1726882914.98154: done checking to see if all hosts have failed 35374 1726882914.98155: getting the remaining hosts for this loop 35374 1726882914.98156: done getting the remaining hosts for this loop 35374 1726882914.98158: getting the next task for host managed_node1 35374 1726882914.98166: done getting next task for host managed_node1 35374 1726882914.98168: ^ task is: TASK: Include the task 'el_repo_setup.yml' 35374 1726882914.98170: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882914.98172: getting variables 35374 1726882914.98178: in VariableManager get_vars() 35374 1726882914.98202: Calling all_inventory to load vars for managed_node1 35374 1726882914.98205: Calling groups_inventory to load vars for managed_node1 35374 1726882914.98207: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882914.98212: Calling all_plugins_play to load vars for managed_node1 35374 1726882914.98215: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882914.98218: Calling groups_plugins_play to load vars for managed_node1 35374 1726882914.98392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882914.98593: done with get_vars() 35374 1726882914.98600: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:11 Friday 20 September 2024 21:41:54 -0400 (0:00:01.534) 0:00:01.565 ****** 35374 1726882914.98671: entering _queue_task() for managed_node1/include_tasks 35374 1726882914.98672: Creating lock for include_tasks 35374 1726882914.98943: worker is 1 (out of 1 available) 35374 1726882914.98955: exiting _queue_task() for managed_node1/include_tasks 35374 1726882914.98968: done queuing things up, now waiting for results queue to drain 35374 1726882914.98970: waiting for pending results... 35374 1726882914.99192: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 35374 1726882914.99285: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000006 35374 1726882914.99304: variable 'ansible_search_path' from source: unknown 35374 1726882914.99343: calling self._execute() 35374 1726882914.99415: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882914.99428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882914.99440: variable 'omit' from source: magic vars 35374 1726882914.99542: _execute() done 35374 1726882914.99549: dumping result to json 35374 1726882914.99556: done dumping result, returning 35374 1726882914.99568: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-ee6a-9b8c-000000000006] 35374 1726882914.99579: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000006 35374 1726882914.99711: no more pending results, returning what we have 35374 1726882914.99717: in VariableManager get_vars() 35374 1726882914.99750: Calling all_inventory to load vars for managed_node1 35374 1726882914.99753: Calling groups_inventory to load vars for managed_node1 35374 1726882914.99757: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882914.99772: Calling all_plugins_play to load vars for managed_node1 35374 1726882914.99776: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882914.99779: Calling groups_plugins_play to load vars for managed_node1 35374 1726882914.99940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.00140: done with get_vars() 35374 1726882915.00147: variable 'ansible_search_path' from source: unknown 35374 1726882915.00162: we have included files to process 35374 1726882915.00165: generating all_blocks data 35374 1726882915.00167: done generating all_blocks data 35374 1726882915.00167: processing included file: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 35374 1726882915.00168: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 35374 1726882915.00171: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 35374 1726882915.01015: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000006 35374 1726882915.01019: WORKER PROCESS EXITING 35374 1726882915.01115: in VariableManager get_vars() 35374 1726882915.01132: done with get_vars() 35374 1726882915.01146: done processing included file 35374 1726882915.01148: iterating over new_blocks loaded from include file 35374 1726882915.01149: in VariableManager get_vars() 35374 1726882915.01158: done with get_vars() 35374 1726882915.01159: filtering new block on tags 35374 1726882915.01176: done filtering new block on tags 35374 1726882915.01180: in VariableManager get_vars() 35374 1726882915.01191: done with get_vars() 35374 1726882915.01192: filtering new block on tags 35374 1726882915.01209: done filtering new block on tags 35374 1726882915.01211: in VariableManager get_vars() 35374 1726882915.01221: done with get_vars() 35374 1726882915.01222: filtering new block on tags 35374 1726882915.01236: done filtering new block on tags 35374 1726882915.01238: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 35374 1726882915.01244: extending task lists for all hosts with included blocks 35374 1726882915.01292: done extending task lists 35374 1726882915.01293: done processing included files 35374 1726882915.01294: results queue empty 35374 1726882915.01295: checking for any_errors_fatal 35374 1726882915.01296: done checking for any_errors_fatal 35374 1726882915.01297: checking for max_fail_percentage 35374 1726882915.01298: done checking for max_fail_percentage 35374 1726882915.01299: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.01300: done checking to see if all hosts have failed 35374 1726882915.01301: getting the remaining hosts for this loop 35374 1726882915.01302: done getting the remaining hosts for this loop 35374 1726882915.01304: getting the next task for host managed_node1 35374 1726882915.01307: done getting next task for host managed_node1 35374 1726882915.01309: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 35374 1726882915.01312: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882915.01314: getting variables 35374 1726882915.01315: in VariableManager get_vars() 35374 1726882915.01322: Calling all_inventory to load vars for managed_node1 35374 1726882915.01324: Calling groups_inventory to load vars for managed_node1 35374 1726882915.01326: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.01331: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.01333: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.01336: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.01485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.01675: done with get_vars() 35374 1726882915.01683: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:41:55 -0400 (0:00:00.030) 0:00:01.596 ****** 35374 1726882915.01744: entering _queue_task() for managed_node1/setup 35374 1726882915.01974: worker is 1 (out of 1 available) 35374 1726882915.01984: exiting _queue_task() for managed_node1/setup 35374 1726882915.01995: done queuing things up, now waiting for results queue to drain 35374 1726882915.01997: waiting for pending results... 35374 1726882915.02223: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 35374 1726882915.02324: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000158 35374 1726882915.02347: variable 'ansible_search_path' from source: unknown 35374 1726882915.02354: variable 'ansible_search_path' from source: unknown 35374 1726882915.02396: calling self._execute() 35374 1726882915.02470: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.02481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.02495: variable 'omit' from source: magic vars 35374 1726882915.03074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 35374 1726882915.05321: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 35374 1726882915.05390: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 35374 1726882915.05435: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 35374 1726882915.05476: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 35374 1726882915.05512: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 35374 1726882915.05592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 35374 1726882915.05641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 35374 1726882915.05675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 35374 1726882915.05726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 35374 1726882915.05747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 35374 1726882915.05920: variable 'ansible_facts' from source: unknown 35374 1726882915.05996: variable 'network_test_required_facts' from source: task vars 35374 1726882915.06035: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 35374 1726882915.06044: when evaluation is False, skipping this task 35374 1726882915.06054: _execute() done 35374 1726882915.06060: dumping result to json 35374 1726882915.06071: done dumping result, returning 35374 1726882915.06082: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-ee6a-9b8c-000000000158] 35374 1726882915.06092: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000158 skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 35374 1726882915.06253: no more pending results, returning what we have 35374 1726882915.06257: results queue empty 35374 1726882915.06258: checking for any_errors_fatal 35374 1726882915.06260: done checking for any_errors_fatal 35374 1726882915.06260: checking for max_fail_percentage 35374 1726882915.06262: done checking for max_fail_percentage 35374 1726882915.06262: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.06265: done checking to see if all hosts have failed 35374 1726882915.06266: getting the remaining hosts for this loop 35374 1726882915.06268: done getting the remaining hosts for this loop 35374 1726882915.06271: getting the next task for host managed_node1 35374 1726882915.06283: done getting next task for host managed_node1 35374 1726882915.06286: ^ task is: TASK: Check if system is ostree 35374 1726882915.06288: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
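The skip recorded above comes from the task's when: expression, not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts. As in Python, not binds looser than ==, so the test means "gather facts only if the facts collected so far do not already cover every required fact"; the earlier Gathering Facts task populated them all, the comparison held, and the conditional evaluated to False. A rough Python equivalent (the real intersect filter is an Ansible Jinja2 filter and is set-based; the names here mirror the log's variables):

    def needs_fact_gathering(ansible_facts: dict, network_test_required_facts: list) -> bool:
        # Keep the required fact names that are already present; if that
        # intersection equals the full required list, nothing is missing
        # and the task can be skipped.
        intersection = [k for k in network_test_required_facts if k in ansible_facts]
        return not (intersection == network_test_required_facts)

    # Every required fact is already gathered, so the conditional is False
    # and the log shows "skipping: [managed_node1]".
    assert needs_fact_gathering({"distribution": "RedHat", "os_family": "RedHat"},
                                ["distribution", "os_family"]) is False
    assert needs_fact_gathering({}, ["distribution"]) is True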
35374 1726882915.06292: getting variables 35374 1726882915.06294: in VariableManager get_vars() 35374 1726882915.06322: Calling all_inventory to load vars for managed_node1 35374 1726882915.06324: Calling groups_inventory to load vars for managed_node1 35374 1726882915.06328: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.06338: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.06341: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.06344: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.06511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.06710: done with get_vars() 35374 1726882915.06720: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:41:55 -0400 (0:00:00.050) 0:00:01.646 ****** 35374 1726882915.06815: entering _queue_task() for managed_node1/stat 35374 1726882915.06833: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000158 35374 1726882915.06842: WORKER PROCESS EXITING 35374 1726882915.07299: worker is 1 (out of 1 available) 35374 1726882915.07311: exiting _queue_task() for managed_node1/stat 35374 1726882915.07323: done queuing things up, now waiting for results queue to drain 35374 1726882915.07324: waiting for pending results... 35374 1726882915.07551: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 35374 1726882915.07651: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000015a 35374 1726882915.07677: variable 'ansible_search_path' from source: unknown 35374 1726882915.07686: variable 'ansible_search_path' from source: unknown 35374 1726882915.07725: calling self._execute() 35374 1726882915.07799: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.07810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.07824: variable 'omit' from source: magic vars 35374 1726882915.08330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 35374 1726882915.08566: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 35374 1726882915.08614: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 35374 1726882915.08656: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 35374 1726882915.08697: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 35374 1726882915.08784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 35374 1726882915.08814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 35374 1726882915.08846: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 35374 1726882915.08884: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 35374 1726882915.09004: Evaluated conditional (not __network_is_ostree is defined): True 35374 1726882915.09016: variable 'omit' from source: magic vars 35374 1726882915.09054: variable 'omit' from source: magic vars 35374 1726882915.09100: variable 'omit' from source: magic vars 35374 1726882915.09130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 35374 1726882915.09161: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 35374 1726882915.09188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 35374 1726882915.09211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 35374 1726882915.09225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 35374 1726882915.09256: variable 'inventory_hostname' from source: host vars for 'managed_node1' 35374 1726882915.09267: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.09276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.09368: Set connection var ansible_shell_type to sh 35374 1726882915.09382: Set connection var ansible_shell_executable to /bin/sh 35374 1726882915.09392: Set connection var ansible_pipelining to False 35374 1726882915.09401: Set connection var ansible_timeout to 10 35374 1726882915.09417: Set connection var ansible_module_compression to ZIP_DEFLATED 35374 1726882915.09427: Set connection var ansible_connection to ssh 35374 1726882915.09454: variable 'ansible_shell_executable' from source: unknown 35374 1726882915.09462: variable 'ansible_connection' from source: unknown 35374 1726882915.09473: variable 'ansible_module_compression' from source: unknown 35374 1726882915.09482: variable 'ansible_shell_type' from source: unknown 35374 1726882915.09489: variable 'ansible_shell_executable' from source: unknown 35374 1726882915.09495: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.09503: variable 'ansible_pipelining' from source: unknown 35374 1726882915.09509: variable 'ansible_timeout' from source: unknown 35374 1726882915.09521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.09658: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 35374 1726882915.09677: variable 'omit' from source: magic vars 35374 1726882915.09687: starting attempt loop 35374 1726882915.09694: running the handler 35374 1726882915.09709: _low_level_execute_command(): starting 35374 1726882915.09721: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 35374 1726882915.10468: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 35374 1726882915.10486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 35374 1726882915.10506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 35374 1726882915.10526: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 35374 1726882915.10572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 35374 1726882915.10586: stderr chunk (state=3): >>>debug2: match not found <<< 35374 1726882915.10603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882915.10625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 35374 1726882915.10639: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 35374 1726882915.10651: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 35374 1726882915.10666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 35374 1726882915.10684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 35374 1726882915.10701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 35374 1726882915.10718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 35374 1726882915.10731: stderr chunk (state=3): >>>debug2: match found <<< 35374 1726882915.10746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882915.10822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 35374 1726882915.10846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 35374 1726882915.10862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 35374 1726882915.10989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 35374 1726882915.12649: stdout chunk (state=3): >>>/root <<< 35374 1726882915.12761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 35374 1726882915.12815: stderr chunk (state=3): >>><<< 35374 1726882915.12819: stdout chunk (state=3): >>><<< 35374 1726882915.12837: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 35374 1726882915.12850: _low_level_execute_command(): starting 35374 1726882915.12856: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882915.1283703-35477-247817537395111 `" && echo ansible-tmp-1726882915.1283703-35477-247817537395111="` echo /root/.ansible/tmp/ansible-tmp-1726882915.1283703-35477-247817537395111 `" ) && sleep 0' 35374 1726882915.13328: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 35374 1726882915.13332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 35374 1726882915.13376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 35374 1726882915.13379: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882915.13381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 35374 1726882915.13383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 35374 1726882915.13386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882915.13430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 35374 1726882915.13433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 35374 1726882915.13535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 35374 1726882915.16095: stdout chunk (state=3): >>>ansible-tmp-1726882915.1283703-35477-247817537395111=/root/.ansible/tmp/ansible-tmp-1726882915.1283703-35477-247817537395111 <<< 35374 1726882915.16144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 35374 1726882915.16200: stderr chunk (state=3): >>><<< 35374 1726882915.16206: stdout chunk (state=3): >>><<< 35374 1726882915.16230: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882915.1283703-35477-247817537395111=/root/.ansible/tmp/ansible-tmp-1726882915.1283703-35477-247817537395111 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0
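The command that just completed is the standard remote work-directory setup: under umask 77 it creates the base temp directory and a unique ansible-tmp-<epoch>-<pid>-<random> directory inside it, then echoes the directory's name=path pair on stdout so the controller learns the shell-expanded path; the trailing sleep 0 is appended to every wrapped command in this log. A sketch of how such a command line can be assembled (shaped to match the log output above, not ansible-core's actual implementation):

    import random
    import time

    def build_tmpdir_command(remote_tmp: str = "/root/.ansible/tmp", pid: int = 35477) -> str:
        # Unique basename in the epoch-pid-random layout visible in the log.
        basename = "ansible-tmp-%s-%s-%s" % (time.time(), pid, random.randint(0, 2 ** 48))
        tmpdir = "%s/%s" % (remote_tmp, basename)
        # umask 77 keeps the directory private to the connecting user; the
        # final echo reports the created path back on stdout for parsing.
        return (
            "/bin/sh -c '( umask 77 && mkdir -p \"` echo %s `\" "
            "&& mkdir \"` echo %s `\" "
            "&& echo %s=\"` echo %s `\" ) && sleep 0'"
            % (remote_tmp, tmpdir, basename, tmpdir)
        )

    print(build_tmpdir_command())

The AnsiballZ_stat.py payload transferred in the next lines is written into exactly this directory over sftp and made executable with chmod u+x before being executed there.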
35374 1726882915.16271: variable 'ansible_module_compression' from source: unknown 35374 1726882915.16325: ANSIBALLZ: Using lock for stat 35374 1726882915.16329: ANSIBALLZ: Acquiring lock 35374 1726882915.16331: ANSIBALLZ: Lock acquired: 139643193456304 35374 1726882915.16333: ANSIBALLZ: Creating module 35374 1726882915.24817: ANSIBALLZ: Writing module into payload 35374 1726882915.24897: ANSIBALLZ: Writing module 35374 1726882915.24912: ANSIBALLZ: Renaming module 35374 1726882915.24917: ANSIBALLZ: Done creating module 35374 1726882915.24930: variable 'ansible_facts' from source: unknown 35374 1726882915.24980: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882915.1283703-35477-247817537395111/AnsiballZ_stat.py 35374 1726882915.25103: Sending initial data 35374 1726882915.25113: Sent initial data (153 bytes) 35374 1726882915.25785: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 35374 1726882915.25788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 35374 1726882915.25822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882915.25826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 35374 1726882915.25830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882915.25879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 35374 1726882915.25891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 35374 1726882915.26004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 35374 1726882915.28561: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 35374 1726882915.28657: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 35374 1726882915.28760: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-35374mvgt63ho/tmpfgn8j7d1 /root/.ansible/tmp/ansible-tmp-1726882915.1283703-35477-247817537395111/AnsiballZ_stat.py <<< 35374 1726882915.28857: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 35374 
1726882915.30078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 35374 1726882915.30292: stderr chunk (state=3): >>><<< 35374 1726882915.30295: stdout chunk (state=3): >>><<< 35374 1726882915.30297: done transferring module to remote 35374 1726882915.30300: _low_level_execute_command(): starting 35374 1726882915.30302: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882915.1283703-35477-247817537395111/ /root/.ansible/tmp/ansible-tmp-1726882915.1283703-35477-247817537395111/AnsiballZ_stat.py && sleep 0' 35374 1726882915.30875: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 35374 1726882915.30889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 35374 1726882915.30909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 35374 1726882915.30926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 35374 1726882915.30974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 35374 1726882915.30986: stderr chunk (state=3): >>>debug2: match not found <<< 35374 1726882915.30999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882915.31015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 35374 1726882915.31025: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 35374 1726882915.31034: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 35374 1726882915.31045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 35374 1726882915.31060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 35374 1726882915.31086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 35374 1726882915.31099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 35374 1726882915.31109: stderr chunk (state=3): >>>debug2: match found <<< 35374 1726882915.31121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882915.31204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 35374 1726882915.31225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 35374 1726882915.31241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 35374 1726882915.31369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 35374 1726882915.33797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 35374 1726882915.33882: stderr chunk (state=3): >>><<< 35374 1726882915.33892: stdout chunk (state=3): >>><<< 35374 1726882915.33994: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 35374 1726882915.33997: _low_level_execute_command(): starting 35374 1726882915.34000: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882915.1283703-35477-247817537395111/AnsiballZ_stat.py && sleep 0' 35374 1726882915.34555: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 35374 1726882915.34572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 35374 1726882915.34587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 35374 1726882915.34605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 35374 1726882915.34653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 35374 1726882915.34668: stderr chunk (state=3): >>>debug2: match not found <<< 35374 1726882915.34683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882915.34700: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 35374 1726882915.34711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 35374 1726882915.34722: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 35374 1726882915.34735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 35374 1726882915.34749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 35374 1726882915.34775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 35374 1726882915.34787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 35374 1726882915.34797: stderr chunk (state=3): >>>debug2: match found <<< 35374 1726882915.34810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882915.34886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 35374 1726882915.34908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 35374 1726882915.34925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 35374 1726882915.35055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 35374 1726882915.37800: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 35374 1726882915.37803: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 35374 1726882915.37806: stdout chunk (state=3): >>>import '_weakref' # <<< 35374 1726882915.37891: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 35374 1726882915.37941: stdout chunk (state=3): >>>import 'posix' # <<< 35374 
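
Two steps happen in the records above: first the freshly uploaded temp directory and AnsiballZ_stat.py are made executable with chmod u+x, then the module is launched as PYTHONVERBOSE=1 /usr/bin/python3.9 .../AnsiballZ_stat.py. PYTHONVERBOSE=1 makes the remote interpreter announce every module it imports, which is why the rest of this capture is dominated by import and "# code object from" chatter rather than task output; note the managed node runs the module under Python 3.9. Both steps are easy to reproduce locally (a sketch with placeholder arguments, not Ansible's code):

    import os, stat, subprocess

    def run_like_ansible(tmpdir: str, wrapper: str) -> None:
        # chmod u+x on both the temp dir and the wrapper, as in the trace:
        # add the owner-execute bit without disturbing the other mode bits.
        for path in (tmpdir, wrapper):
            os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)
        # PYTHONVERBOSE=1 is what produces the import trace that follows;
        # the interpreter path mirrors the one chosen for the managed node.
        env = dict(os.environ, PYTHONVERBOSE="1")
        subprocess.run(["/usr/bin/python3.9", wrapper], env=env, check=True)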
1726882915.37979: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 35374 1726882915.38044: stdout chunk (state=3): >>>import 'time' # <<< 35374 1726882915.38048: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 35374 1726882915.38102: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 35374 1726882915.38132: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 35374 1726882915.38176: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 35374 1726882915.38198: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e282d8dc0> <<< 35374 1726882915.38250: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 35374 1726882915.38280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2827d3a0> <<< 35374 1726882915.38300: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e282d8b20> <<< 35374 1726882915.38313: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 35374 1726882915.38334: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e282d8ac0> <<< 35374 1726882915.38357: stdout chunk (state=3): >>>import '_signal' # <<< 35374 1726882915.38393: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 35374 1726882915.38407: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2827d490> <<< 35374 1726882915.38432: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 35374 1726882915.38481: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 35374 1726882915.38503: stdout chunk (state=3): >>>import '_abc' # <<< 35374 1726882915.38507: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2827d940> <<< 35374 1726882915.38518: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2827d670> <<< 35374 1726882915.38570: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 35374 1726882915.38603: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 35374 1726882915.38616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 35374 1726882915.38637: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 35374 1726882915.38659: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 35374 1726882915.38690: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e28234190> <<< 35374 1726882915.38740: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 35374 1726882915.38859: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e28234220> <<< 35374 1726882915.38895: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 35374 1726882915.38923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 35374 1726882915.38951: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e28257850> <<< 35374 1726882915.38954: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e28234940> <<< 35374 1726882915.38987: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e28295880> <<< 35374 1726882915.39023: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 35374 1726882915.39037: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2822dd90> <<< 35374 1726882915.39102: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 35374 1726882915.39119: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e28257d90> <<< 35374 1726882915.39186: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2827d970> <<< 35374 1726882915.39220: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
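
Everything in the stdout stream so far is interpreter bootstrap (codecs, os, site and friends), ending with the version banner, which confirms the module executes under Python 3.9.19 on the managed node. The module's own code only enters the picture further down, once zipimport opens the AnsiballZ payload; when reading such traces it can help to filter for the payload name (a throwaway sketch, with the path fragment taken from the zipimport line later in this capture):

    import sys

    # Keep only trace lines that touch the AnsiballZ payload, dropping
    # the stdlib bootstrap noise.
    for line in sys.stdin:
        if "ansible_stat_payload" in line:
            sys.stdout.write(line)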
<<< 35374 1726882915.39529: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 35374 1726882915.39541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 35374 1726882915.39592: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 35374 1726882915.39595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 35374 1726882915.39606: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 35374 1726882915.39624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 35374 1726882915.39642: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 35374 1726882915.39683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 35374 1726882915.39686: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27faeeb0> <<< 35374 1726882915.39747: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fb1f40> <<< 35374 1726882915.39768: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 35374 1726882915.39792: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 35374 1726882915.39812: stdout chunk (state=3): >>>import '_sre' # <<< 35374 1726882915.39829: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 35374 1726882915.39840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 35374 1726882915.39871: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 35374 1726882915.39912: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fa7610> <<< 35374 1726882915.39933: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fad640> <<< 35374 1726882915.39956: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fae370> <<< 35374 1726882915.39962: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 35374 1726882915.40052: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 35374 1726882915.40072: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 35374 1726882915.40126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 35374 1726882915.40144: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from 
'/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 35374 1726882915.40181: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27f2fe20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f2f910> <<< 35374 1726882915.40208: stdout chunk (state=3): >>>import 'itertools' # <<< 35374 1726882915.40241: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f2ff10> <<< 35374 1726882915.40262: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 35374 1726882915.40285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 35374 1726882915.40316: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f2ffd0> <<< 35374 1726882915.40357: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 35374 1726882915.40376: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f420d0> import '_collections' # <<< 35374 1726882915.40441: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f89d90> import '_functools' # <<< 35374 1726882915.40478: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f82670> <<< 35374 1726882915.40562: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 35374 1726882915.40587: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f956d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fb5e20> <<< 35374 1726882915.40598: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 35374 1726882915.40635: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27f42cd0> <<< 35374 1726882915.40649: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f892b0> <<< 35374 1726882915.40697: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from 
'/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27f952e0> <<< 35374 1726882915.40709: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fbb9d0> <<< 35374 1726882915.40733: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 35374 1726882915.40742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 35374 1726882915.40772: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 35374 1726882915.40801: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 35374 1726882915.40823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f42eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f42df0> <<< 35374 1726882915.40863: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f42d60> <<< 35374 1726882915.40900: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 35374 1726882915.40917: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 35374 1726882915.40935: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 35374 1726882915.40970: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 35374 1726882915.41029: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 35374 1726882915.41067: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f153d0> <<< 35374 1726882915.41095: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 35374 1726882915.41119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 35374 1726882915.41157: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f154c0> <<< 35374 1726882915.41343: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f49f40> <<< 35374 1726882915.41386: stdout chunk (state=3): >>>import 'importlib.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f44a90> <<< 35374 1726882915.41410: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f44490> <<< 35374 1726882915.41432: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 35374 1726882915.41456: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 35374 1726882915.41485: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 35374 1726882915.41509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 35374 1726882915.41539: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 35374 1726882915.41562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e49220> <<< 35374 1726882915.41599: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f00520> <<< 35374 1726882915.41682: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f44f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fbb040> <<< 35374 1726882915.41706: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 35374 1726882915.41738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 35374 1726882915.41772: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 35374 1726882915.41797: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e5bb50> import 'errno' # <<< 35374 1726882915.41826: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e5be80> <<< 35374 1726882915.41850: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 35374 1726882915.41885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 35374 1726882915.41910: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e6c790> <<< 35374 1726882915.41936: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 35374 1726882915.41979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 35374 1726882915.42017: stdout chunk 
(state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e6ccd0> <<< 35374 1726882915.42043: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e05400> <<< 35374 1726882915.42060: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e5bf70> <<< 35374 1726882915.42089: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 35374 1726882915.42098: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 35374 1726882915.42140: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e162e0> <<< 35374 1726882915.42168: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e6c610> <<< 35374 1726882915.42201: stdout chunk (state=3): >>>import 'pwd' # <<< 35374 1726882915.42214: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e163a0> <<< 35374 1726882915.42268: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f42a30> <<< 35374 1726882915.42282: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 35374 1726882915.42312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 35374 1726882915.42337: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 35374 1726882915.42350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 35374 1726882915.42383: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e31700> <<< 35374 1726882915.42407: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 35374 1726882915.42444: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e319d0> import 'bisect' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e317c0> <<< 35374 1726882915.42483: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e318b0> <<< 35374 1726882915.42525: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 35374 1726882915.43067: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e31d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e3c250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e31940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e25a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f42610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e31af0> <<< 35374 1726882915.43156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 35374 1726882915.43171: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6e27d5a6d0> <<< 35374 1726882915.43380: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip' <<< 35374 1726882915.43383: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.43555: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.43603: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/__init__.py <<< 35374 1726882915.43628: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.43644: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.43648: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 35374 1726882915.43676: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.45662: stdout chunk (state=3): >>># zipimport: zlib available<<< 35374 1726882915.45670: stdout chunk (state=3): >>> <<< 35374 1726882915.47276: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py<<< 35374 1726882915.47280: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 35374 1726882915.47282: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27724820> <<< 35374 1726882915.47331: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 35374 1726882915.47334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc'<<< 35374 1726882915.47337: stdout chunk (state=3): >>> <<< 35374 1726882915.47396: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 35374 1726882915.47400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc'<<< 35374 1726882915.47402: stdout chunk (state=3): >>> <<< 35374 1726882915.47449: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 35374 1726882915.47526: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e277b3730> <<< 35374 1726882915.47613: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277b3610> <<< 35374 1726882915.47710: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277b3340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 35374 1726882915.47716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 35374 1726882915.47799: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277b3460> <<< 35374 1726882915.47817: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277b3160> <<< 35374 1726882915.47858: stdout chunk (state=3): >>>import 'atexit' # <<< 35374 1726882915.47903: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e277b33a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 35374 1726882915.47957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 35374 1726882915.48036: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277b3790> <<< 35374 1726882915.48067: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 35374 1726882915.48092: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 35374 1726882915.48105: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 35374 1726882915.48141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 35374 1726882915.48204: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 35374 1726882915.48207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 35374 1726882915.48333: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e276a47f0> <<< 35374 1726882915.48401: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e276a4b80><<< 35374 1726882915.48404: stdout chunk (state=3): >>> <<< 35374 1726882915.48458: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so'<<< 35374 1726882915.48473: stdout chunk (state=3): >>> import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e276a49d0> <<< 35374 1726882915.48503: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 35374 1726882915.48558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 35374 1726882915.48615: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e276c3af0> <<< 35374 1726882915.48646: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277add60> <<< 35374 1726882915.48937: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277b34f0> <<< 35374 1726882915.48976: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 35374 1726882915.48994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 35374 1726882915.49017: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277ad1c0> <<< 35374 1726882915.49055: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 35374 1726882915.49080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 35374 1726882915.49125: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc'<<< 35374 1726882915.49137: stdout chunk (state=3): >>> <<< 35374 1726882915.49168: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 35374 1726882915.49191: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 35374 1726882915.49238: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 35374 1726882915.49261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2771fb20> <<< 35374 1726882915.49398: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27756eb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277568b0> <<< 35374 1726882915.49425: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e276bd2e0> <<< 35374 1726882915.49457: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so'<<< 35374 1726882915.49494: stdout chunk (state=3): >>> # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e277569a0> <<< 35374 1726882915.49551: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py<<< 35374 1726882915.49571: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27784d00> <<< 35374 1726882915.49609: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 35374 1726882915.49640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc'<<< 35374 1726882915.49669: stdout chunk (state=3): >>> <<< 35374 1726882915.49682: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 35374 1726882915.49728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 35374 1726882915.49852: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27686a00><<< 35374 1726882915.49885: stdout chunk (state=3): >>> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2778ce80> <<< 35374 1726882915.49908: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 35374 1726882915.49923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 35374 1726882915.50017: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882915.50029: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e276940a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2778ceb0> <<< 35374 1726882915.50057: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 35374 1726882915.50121: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 35374 1726882915.50171: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 35374 1726882915.50184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 35374 1726882915.50285: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27759730> <<< 35374 1726882915.50512: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e276940d0> <<< 35374 1726882915.50667: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882915.50680: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27691550> <<< 35374 1726882915.50724: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882915.50750: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27691610> <<< 35374 1726882915.50824: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882915.50845: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27690c40> <<< 35374 1726882915.50861: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27784ee0> <<< 35374 1726882915.50898: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 35374 1726882915.50910: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 35374 1726882915.50940: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 35374 1726882915.50974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 35374 1726882915.51052: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882915.51070: stdout chunk (state=3): >>># extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27715b50> <<< 35374 1726882915.51388: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882915.51422: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27713940> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27687820> <<< 35374 1726882915.51486: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 35374 1726882915.51527: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e277155b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2774daf0> <<< 35374 1726882915.51542: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.51580: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.51594: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 35374 1726882915.51625: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.51756: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.51905: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.51912: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.51915: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 35374 1726882915.51971: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.51974: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.51977: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py<<< 35374 1726882915.51979: stdout chunk (state=3): >>> <<< 35374 1726882915.52008: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.52175: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.52330: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.53128: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.53901: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 35374 1726882915.53943: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 35374 1726882915.53946: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 35374 1726882915.53948: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip 
/tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 35374 1726882915.53986: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py<<< 35374 1726882915.53991: stdout chunk (state=3): >>> <<< 35374 1726882915.54015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc'<<< 35374 1726882915.54018: stdout chunk (state=3): >>> <<< 35374 1726882915.54117: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so'<<< 35374 1726882915.54120: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e2726fdf0><<< 35374 1726882915.54123: stdout chunk (state=3): >>> <<< 35374 1726882915.54234: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc'<<< 35374 1726882915.54239: stdout chunk (state=3): >>> <<< 35374 1726882915.54258: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e276615b0> <<< 35374 1726882915.54285: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27652df0><<< 35374 1726882915.54289: stdout chunk (state=3): >>> <<< 35374 1726882915.54354: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py<<< 35374 1726882915.54360: stdout chunk (state=3): >>> <<< 35374 1726882915.54379: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.54413: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.54430: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 35374 1726882915.54457: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.54660: stdout chunk (state=3): >>># zipimport: zlib available<<< 35374 1726882915.54664: stdout chunk (state=3): >>> <<< 35374 1726882915.54882: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc'<<< 35374 1726882915.54885: stdout chunk (state=3): >>> <<< 35374 1726882915.54917: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2770a9d0> <<< 35374 1726882915.54934: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.55586: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.56199: stdout chunk (state=3): >>># zipimport: zlib available <<< 35374 1726882915.56299: stdout chunk (state=3): >>># zipimport: zlib available<<< 35374 1726882915.56303: stdout chunk (state=3): >>> <<< 35374 1726882915.56413: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip 
/tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/collections.py<<< 35374 1726882915.64958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 35374 1726882915.64995: stderr chunk (state=3): >>><<< 35374 1726882915.65004: stdout chunk (state=3): >>><<< 35374 1726882915.65069: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e282d8dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2827d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e282d8b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e282d8ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2827d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from
'/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2827d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2827d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e28234190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e28234220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e28257850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e28234940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e28295880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2822dd90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e28257d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2827d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27faeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fb1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fa7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fad640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fae370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27f2fe20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f2f910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f2ff10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f2ffd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f420d0> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f89d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f82670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f956d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fb5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27f42cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f892b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27f952e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fbb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f42eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f42df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f42d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f153d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f154c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f49f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f44a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f44490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e49220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f00520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f44f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27fbb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e5bb50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e5be80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e6c790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e6ccd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e05400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e5bf70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e162e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e6c610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e163a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f42a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e31700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e319d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e317c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e318b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e31d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27e3c250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e31940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e25a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27f42610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27e31af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6e27d5a6d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27724820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e277b3730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277b3610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277b3340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277b3460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277b3160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e277b33a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277b3790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object 
from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e276a47f0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e276a4b80> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e276a49d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e276c3af0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277add60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277b34f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277ad1c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2771fb20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27756eb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e277568b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e276bd2e0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e277569a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27784d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27686a00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2778ce80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e276940a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2778ceb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27759730> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e276940d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27691550> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27691610> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27690c40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27784ee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27715b50> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e27713940> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27687820> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e277155b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2774daf0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e2726fdf0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e276615b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27652df0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e2770a9d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27242e50> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6e2779e910> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27242be0> import ansible.module_utils.common.file # 
loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27203c70> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27654670> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6e27653850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_ae3ndi2k/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing 
systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy 
grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
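The module whose payload imports and interpreter teardown are traced above is ansible.builtin.stat, invoked against /run/ostree-booted with otherwise default arguments (follow: false, get_checksum: true, and so on, as echoed in the invocation JSON). A minimal sketch of the task that would produce this invocation, assuming the register name from the __ostree_booted_stat variable referenced later in this trace:

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted  # absent on non-ostree systems, hence "exists": false in the result above
  register: __ostree_booted_stat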
[WARNING]: Module invocation had junk after the JSON data: (the junk is the interpreter shutdown trace, repeated verbatim from the module output above, from '# destroy __main__' through '# clear sys.audit hooks') 35374 1726882915.65629: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882915.1283703-35477-247817537395111/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 35374 1726882915.65632: _low_level_execute_command(): starting 35374 1726882915.65635: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882915.1283703-35477-247817537395111/ > /dev/null 2>&1 && sleep 0' 35374 1726882915.65723: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 35374 1726882915.65733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 35374 1726882915.65749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 35374 1726882915.65753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 35374 1726882915.65785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 35374 1726882915.65814: stderr chunk (state=3): >>>debug2: match not found <<< 35374 1726882915.65817: stderr chunk (state=3): >>>debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882915.65819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 35374 1726882915.65821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 35374 1726882915.65824: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 35374 1726882915.65888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 35374 1726882915.65892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 35374 1726882915.65897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 35374 1726882915.65993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 35374 1726882915.68729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 35374 1726882915.68780: stderr chunk (state=3): >>><<< 35374 1726882915.68783: stdout chunk (state=3): >>><<< 35374 1726882915.68796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 35374 1726882915.68801: handler run complete 35374 1726882915.68819: attempt loop complete, returning result 35374 1726882915.68823: _execute() done 35374 1726882915.68827: dumping result to json 35374 1726882915.68829: done dumping result, returning 35374 1726882915.68837: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0e448fcc-3ce9-ee6a-9b8c-00000000015a] 35374 1726882915.68839: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000015a 35374 1726882915.68928: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000015a 35374 1726882915.68931: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 35374 1726882915.68990: no more pending results, returning what we have 35374 1726882915.68993: results queue empty 35374 1726882915.68994: checking for any_errors_fatal 35374 1726882915.68998: done checking for any_errors_fatal 35374 1726882915.68999: checking for max_fail_percentage 35374 1726882915.69000: done 
checking for max_fail_percentage 35374 1726882915.69001: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.69002: done checking to see if all hosts have failed 35374 1726882915.69002: getting the remaining hosts for this loop 35374 1726882915.69004: done getting the remaining hosts for this loop 35374 1726882915.69007: getting the next task for host managed_node1 35374 1726882915.69012: done getting next task for host managed_node1 35374 1726882915.69014: ^ task is: TASK: Set flag to indicate system is ostree 35374 1726882915.69017: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882915.69020: getting variables 35374 1726882915.69021: in VariableManager get_vars() 35374 1726882915.69049: Calling all_inventory to load vars for managed_node1 35374 1726882915.69052: Calling groups_inventory to load vars for managed_node1 35374 1726882915.69055: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.69067: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.69069: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.69072: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.69236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.69354: done with get_vars() 35374 1726882915.69361: done getting variables 35374 1726882915.69435: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:41:55 -0400 (0:00:00.626) 0:00:02.273 ****** 35374 1726882915.69455: entering _queue_task() for managed_node1/set_fact 35374 1726882915.69456: Creating lock for set_fact 35374 1726882915.69648: worker is 1 (out of 1 available) 35374 1726882915.69662: exiting _queue_task() for managed_node1/set_fact 35374 1726882915.69679: done queuing things up, now waiting for results queue to drain 35374 1726882915.69681: waiting for pending results... 
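The set_fact task queued here converts the registered stat result into a boolean fact. A hedged sketch, consistent with the conditional evaluated below (not __network_is_ostree is defined) and with the resulting fact __network_is_ostree: false; the exact Jinja2 expression is an assumption:

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"  # assumed expression; yields false when /run/ostree-booted is absent
  when: not __network_is_ostree is defined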
35374 1726882915.69826: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 35374 1726882915.69895: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000015b 35374 1726882915.69907: variable 'ansible_search_path' from source: unknown 35374 1726882915.69910: variable 'ansible_search_path' from source: unknown 35374 1726882915.69944: calling self._execute() 35374 1726882915.69995: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.70000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.70009: variable 'omit' from source: magic vars 35374 1726882915.70413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 35374 1726882915.70650: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 35374 1726882915.70744: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 35374 1726882915.70788: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 35374 1726882915.70834: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 35374 1726882915.70921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 35374 1726882915.70958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 35374 1726882915.70996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 35374 1726882915.71050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 35374 1726882915.71307: Evaluated conditional (not __network_is_ostree is defined): True 35374 1726882915.71313: variable 'omit' from source: magic vars 35374 1726882915.71345: variable 'omit' from source: magic vars 35374 1726882915.71454: variable '__ostree_booted_stat' from source: set_fact 35374 1726882915.71499: variable 'omit' from source: magic vars 35374 1726882915.71517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 35374 1726882915.71536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 35374 1726882915.71552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 35374 1726882915.71567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 35374 1726882915.71578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 35374 1726882915.71602: variable 'inventory_hostname' from source: host vars for 'managed_node1' 35374 1726882915.71605: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.71608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.71676: Set connection var ansible_shell_type to sh 35374 
1726882915.71684: Set connection var ansible_shell_executable to /bin/sh 35374 1726882915.71689: Set connection var ansible_pipelining to False 35374 1726882915.71701: Set connection var ansible_timeout to 10 35374 1726882915.71704: Set connection var ansible_module_compression to ZIP_DEFLATED 35374 1726882915.71706: Set connection var ansible_connection to ssh 35374 1726882915.71722: variable 'ansible_shell_executable' from source: unknown 35374 1726882915.71725: variable 'ansible_connection' from source: unknown 35374 1726882915.71728: variable 'ansible_module_compression' from source: unknown 35374 1726882915.71730: variable 'ansible_shell_type' from source: unknown 35374 1726882915.71732: variable 'ansible_shell_executable' from source: unknown 35374 1726882915.71734: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.71737: variable 'ansible_pipelining' from source: unknown 35374 1726882915.71741: variable 'ansible_timeout' from source: unknown 35374 1726882915.71745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.71816: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 35374 1726882915.71825: variable 'omit' from source: magic vars 35374 1726882915.71830: starting attempt loop 35374 1726882915.71833: running the handler 35374 1726882915.71841: handler run complete 35374 1726882915.71849: attempt loop complete, returning result 35374 1726882915.71851: _execute() done 35374 1726882915.71853: dumping result to json 35374 1726882915.71856: done dumping result, returning 35374 1726882915.71862: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-ee6a-9b8c-00000000015b] 35374 1726882915.71868: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000015b 35374 1726882915.71944: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000015b 35374 1726882915.71947: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 35374 1726882915.72036: no more pending results, returning what we have 35374 1726882915.72039: results queue empty 35374 1726882915.72039: checking for any_errors_fatal 35374 1726882915.72043: done checking for any_errors_fatal 35374 1726882915.72044: checking for max_fail_percentage 35374 1726882915.72045: done checking for max_fail_percentage 35374 1726882915.72046: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.72047: done checking to see if all hosts have failed 35374 1726882915.72047: getting the remaining hosts for this loop 35374 1726882915.72048: done getting the remaining hosts for this loop 35374 1726882915.72051: getting the next task for host managed_node1 35374 1726882915.72060: done getting next task for host managed_node1 35374 1726882915.72062: ^ task is: TASK: Fix CentOS6 Base repo 35374 1726882915.72064: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882915.72069: getting variables 35374 1726882915.72070: in VariableManager get_vars() 35374 1726882915.72092: Calling all_inventory to load vars for managed_node1 35374 1726882915.72093: Calling groups_inventory to load vars for managed_node1 35374 1726882915.72096: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.72102: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.72103: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.72109: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.72209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.72322: done with get_vars() 35374 1726882915.72328: done getting variables 35374 1726882915.72413: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:41:55 -0400 (0:00:00.029) 0:00:02.303 ****** 35374 1726882915.72432: entering _queue_task() for managed_node1/copy 35374 1726882915.72599: worker is 1 (out of 1 available) 35374 1726882915.72611: exiting _queue_task() for managed_node1/copy 35374 1726882915.72622: done queuing things up, now waiting for results queue to drain 35374 1726882915.72623: waiting for pending results... 
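The copy task queued here is a conditional repo fix: the trace that follows shows ansible_distribution == 'CentOS' evaluating True but ansible_distribution_major_version == '6' evaluating False, so the task is skipped. A sketch of that gating pattern; the destination path and file content are hypothetical, since the skipped task never renders them:

- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # hypothetical; not visible in this log
    content: "..."  # elided; never rendered because the task is skipped
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'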
35374 1726882915.72775: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 35374 1726882915.72829: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000015d 35374 1726882915.72848: variable 'ansible_search_path' from source: unknown 35374 1726882915.72851: variable 'ansible_search_path' from source: unknown 35374 1726882915.72878: calling self._execute() 35374 1726882915.72987: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.72991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.72999: variable 'omit' from source: magic vars 35374 1726882915.73305: variable 'ansible_distribution' from source: facts 35374 1726882915.73320: Evaluated conditional (ansible_distribution == 'CentOS'): True 35374 1726882915.73400: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.73404: Evaluated conditional (ansible_distribution_major_version == '6'): False 35374 1726882915.73406: when evaluation is False, skipping this task 35374 1726882915.73409: _execute() done 35374 1726882915.73411: dumping result to json 35374 1726882915.73414: done dumping result, returning 35374 1726882915.73421: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-ee6a-9b8c-00000000015d] 35374 1726882915.73426: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000015d 35374 1726882915.73518: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000015d 35374 1726882915.73521: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 35374 1726882915.73619: no more pending results, returning what we have 35374 1726882915.73621: results queue empty 35374 1726882915.73622: checking for any_errors_fatal 35374 1726882915.73623: done checking for any_errors_fatal 35374 1726882915.73623: checking for max_fail_percentage 35374 1726882915.73625: done checking for max_fail_percentage 35374 1726882915.73625: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.73626: done checking to see if all hosts have failed 35374 1726882915.73626: getting the remaining hosts for this loop 35374 1726882915.73627: done getting the remaining hosts for this loop 35374 1726882915.73629: getting the next task for host managed_node1 35374 1726882915.73632: done getting next task for host managed_node1 35374 1726882915.73634: ^ task is: TASK: Include the task 'enable_epel.yml' 35374 1726882915.73636: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882915.73638: getting variables 35374 1726882915.73639: in VariableManager get_vars() 35374 1726882915.73652: Calling all_inventory to load vars for managed_node1 35374 1726882915.73654: Calling groups_inventory to load vars for managed_node1 35374 1726882915.73656: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.73661: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.73665: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.73667: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.73758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.73876: done with get_vars() 35374 1726882915.73882: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:41:55 -0400 (0:00:00.015) 0:00:02.318 ****** 35374 1726882915.73941: entering _queue_task() for managed_node1/include_tasks 35374 1726882915.74089: worker is 1 (out of 1 available) 35374 1726882915.74100: exiting _queue_task() for managed_node1/include_tasks 35374 1726882915.74111: done queuing things up, now waiting for results queue to drain 35374 1726882915.74112: waiting for pending results... 35374 1726882915.74254: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 35374 1726882915.74318: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000015e 35374 1726882915.74326: variable 'ansible_search_path' from source: unknown 35374 1726882915.74329: variable 'ansible_search_path' from source: unknown 35374 1726882915.74357: calling self._execute() 35374 1726882915.74414: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.74418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.74426: variable 'omit' from source: magic vars 35374 1726882915.74751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 35374 1726882915.76285: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 35374 1726882915.76328: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 35374 1726882915.76485: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 35374 1726882915.76511: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 35374 1726882915.76530: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 35374 1726882915.76590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 35374 1726882915.76609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 35374 1726882915.76628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 35374 1726882915.76655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 35374 1726882915.76671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 35374 1726882915.76747: variable '__network_is_ostree' from source: set_fact 35374 1726882915.76761: Evaluated conditional (not __network_is_ostree | d(false)): True 35374 1726882915.76765: _execute() done 35374 1726882915.76774: dumping result to json 35374 1726882915.76777: done dumping result, returning 35374 1726882915.76780: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-ee6a-9b8c-00000000015e] 35374 1726882915.76786: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000015e 35374 1726882915.76862: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000015e 35374 1726882915.76867: WORKER PROCESS EXITING 35374 1726882915.76898: no more pending results, returning what we have 35374 1726882915.76904: in VariableManager get_vars() 35374 1726882915.76933: Calling all_inventory to load vars for managed_node1 35374 1726882915.76936: Calling groups_inventory to load vars for managed_node1 35374 1726882915.76939: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.76948: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.76950: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.76953: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.77076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.77209: done with get_vars() 35374 1726882915.77214: variable 'ansible_search_path' from source: unknown 35374 1726882915.77215: variable 'ansible_search_path' from source: unknown 35374 1726882915.77238: we have included files to process 35374 1726882915.77239: generating all_blocks data 35374 1726882915.77240: done generating all_blocks data 35374 1726882915.77243: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 35374 1726882915.77244: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 35374 1726882915.77245: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 35374 1726882915.77693: done processing included file 35374 1726882915.77695: iterating over new_blocks loaded from include file 35374 1726882915.77695: in VariableManager get_vars() 35374 1726882915.77703: done with get_vars() 35374 1726882915.77704: filtering new block on tags 35374 1726882915.77717: done filtering new block on tags 35374 1726882915.77719: in VariableManager get_vars() 35374 1726882915.77725: done with get_vars() 35374 1726882915.77726: filtering new block on tags 35374 1726882915.77732: done filtering new block on tags 35374 1726882915.77733: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 35374 1726882915.77737: extending task lists for all hosts with included blocks 
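The include just processed pulls enable_epel.yml (which sits alongside el_repo_setup.yml under tasks/, per the paths above) into the play, gated on the fact set earlier. A sketch matching the evaluated conditional:

- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: enable_epel.yml  # resolved relative to el_repo_setup.yml
  when: not __network_is_ostree | d(false)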
35374 1726882915.77800: done extending task lists 35374 1726882915.77802: done processing included files 35374 1726882915.77802: results queue empty 35374 1726882915.77803: checking for any_errors_fatal 35374 1726882915.77804: done checking for any_errors_fatal 35374 1726882915.77805: checking for max_fail_percentage 35374 1726882915.77805: done checking for max_fail_percentage 35374 1726882915.77806: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.77806: done checking to see if all hosts have failed 35374 1726882915.77807: getting the remaining hosts for this loop 35374 1726882915.77808: done getting the remaining hosts for this loop 35374 1726882915.77809: getting the next task for host managed_node1 35374 1726882915.77812: done getting next task for host managed_node1 35374 1726882915.77813: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 35374 1726882915.77815: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882915.77816: getting variables 35374 1726882915.77817: in VariableManager get_vars() 35374 1726882915.77822: Calling all_inventory to load vars for managed_node1 35374 1726882915.77823: Calling groups_inventory to load vars for managed_node1 35374 1726882915.77825: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.77828: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.77832: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.77834: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.77929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.78036: done with get_vars() 35374 1726882915.78042: done getting variables 35374 1726882915.78088: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 35374 1726882915.78214: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:41:55 -0400 (0:00:00.043) 0:00:02.361 ****** 35374 1726882915.78243: entering _queue_task() for managed_node1/command 35374 1726882915.78244: Creating lock for command 35374 1726882915.78417: worker is 1 (out of 1 available) 35374 1726882915.78430: exiting _queue_task() for managed_node1/command 35374 1726882915.78441: done queuing things up, now waiting for results queue to drain 35374 1726882915.78442: waiting for pending results... 
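
Two things are visible in the banner above. First, the task name is itself a Jinja2 template: the strategy reported the pending task as 'Create EPEL {{ ansible_distribution_major_version }}', and it is rendered against the fact value '9' before the banner prints. Second, this is the first command task of the run, hence the "Creating lock for command" entry. A sketch of the task queued here, with the guards taken from the evaluations just below and the command body a placeholder, since the run skips before ever building it:

    # Plausible shape of enable_epel.yml:8; only the name and the when clauses come from the log
    - name: Create EPEL {{ ansible_distribution_major_version }}   # banner renders as "Create EPEL 9"
      command: /bin/true   # placeholder; the real command is not recoverable from this trace
      when:
        - ansible_distribution in ['RedHat', 'CentOS']        # evaluated True in this run
        - ansible_distribution_major_version in ['7', '8']    # evaluated False on this EL9 host
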
35374 1726882915.78586: running TaskExecutor() for managed_node1/TASK: Create EPEL 9 35374 1726882915.78660: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000178 35374 1726882915.78678: variable 'ansible_search_path' from source: unknown 35374 1726882915.78683: variable 'ansible_search_path' from source: unknown 35374 1726882915.78704: calling self._execute() 35374 1726882915.78760: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.78765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.78774: variable 'omit' from source: magic vars 35374 1726882915.79036: variable 'ansible_distribution' from source: facts 35374 1726882915.79044: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 35374 1726882915.79140: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.79143: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 35374 1726882915.79147: when evaluation is False, skipping this task 35374 1726882915.79150: _execute() done 35374 1726882915.79154: dumping result to json 35374 1726882915.79157: done dumping result, returning 35374 1726882915.79164: done running TaskExecutor() for managed_node1/TASK: Create EPEL 9 [0e448fcc-3ce9-ee6a-9b8c-000000000178] 35374 1726882915.79173: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000178 35374 1726882915.79260: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000178 35374 1726882915.79265: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 35374 1726882915.79317: no more pending results, returning what we have 35374 1726882915.79320: results queue empty 35374 1726882915.79321: checking for any_errors_fatal 35374 1726882915.79322: done checking for any_errors_fatal 35374 1726882915.79323: checking for max_fail_percentage 35374 1726882915.79324: done checking for max_fail_percentage 35374 1726882915.79325: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.79326: done checking to see if all hosts have failed 35374 1726882915.79326: getting the remaining hosts for this loop 35374 1726882915.79328: done getting the remaining hosts for this loop 35374 1726882915.79331: getting the next task for host managed_node1 35374 1726882915.79335: done getting next task for host managed_node1 35374 1726882915.79338: ^ task is: TASK: Install yum-utils package 35374 1726882915.79341: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882915.79344: getting variables 35374 1726882915.79345: in VariableManager get_vars() 35374 1726882915.79365: Calling all_inventory to load vars for managed_node1 35374 1726882915.79366: Calling groups_inventory to load vars for managed_node1 35374 1726882915.79369: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.79377: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.79379: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.79381: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.79482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.79616: done with get_vars() 35374 1726882915.79622: done getting variables 35374 1726882915.79685: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:41:55 -0400 (0:00:00.014) 0:00:02.375 ****** 35374 1726882915.79704: entering _queue_task() for managed_node1/package 35374 1726882915.79705: Creating lock for package 35374 1726882915.79868: worker is 1 (out of 1 available) 35374 1726882915.79881: exiting _queue_task() for managed_node1/package 35374 1726882915.79892: done queuing things up, now waiting for results queue to drain 35374 1726882915.79893: waiting for pending results... 
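
'Install yum-utils package' (enable_epel.yml:26), queued above and skipped just below, swaps command for the generic package action but keeps the same guards. The package name is inferred from the task name, and the state is an assumption:

    - name: Install yum-utils package
      package:
        name: yum-utils        # inferred from the task name; an assumption
        state: present         # assumption
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']

As with the previous task, the first clause templates and evaluates to True, the second to False, and the skipping: result's false_condition field names the clause that failed.
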
35374 1726882915.80022: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 35374 1726882915.80092: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000179 35374 1726882915.80101: variable 'ansible_search_path' from source: unknown 35374 1726882915.80104: variable 'ansible_search_path' from source: unknown 35374 1726882915.80128: calling self._execute() 35374 1726882915.80182: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.80186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.80193: variable 'omit' from source: magic vars 35374 1726882915.80438: variable 'ansible_distribution' from source: facts 35374 1726882915.80447: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 35374 1726882915.80535: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.80539: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 35374 1726882915.80542: when evaluation is False, skipping this task 35374 1726882915.80548: _execute() done 35374 1726882915.80551: dumping result to json 35374 1726882915.80553: done dumping result, returning 35374 1726882915.80560: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0e448fcc-3ce9-ee6a-9b8c-000000000179] 35374 1726882915.80565: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000179 35374 1726882915.80647: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000179 35374 1726882915.80650: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 35374 1726882915.80709: no more pending results, returning what we have 35374 1726882915.80712: results queue empty 35374 1726882915.80712: checking for any_errors_fatal 35374 1726882915.80719: done checking for any_errors_fatal 35374 1726882915.80720: checking for max_fail_percentage 35374 1726882915.80721: done checking for max_fail_percentage 35374 1726882915.80722: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.80723: done checking to see if all hosts have failed 35374 1726882915.80723: getting the remaining hosts for this loop 35374 1726882915.80724: done getting the remaining hosts for this loop 35374 1726882915.80727: getting the next task for host managed_node1 35374 1726882915.80732: done getting next task for host managed_node1 35374 1726882915.80733: ^ task is: TASK: Enable EPEL 7 35374 1726882915.80736: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882915.80738: getting variables 35374 1726882915.80739: in VariableManager get_vars() 35374 1726882915.80756: Calling all_inventory to load vars for managed_node1 35374 1726882915.80757: Calling groups_inventory to load vars for managed_node1 35374 1726882915.80761: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.80769: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.80773: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.80775: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.80877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.80991: done with get_vars() 35374 1726882915.80997: done getting variables 35374 1726882915.81035: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:41:55 -0400 (0:00:00.013) 0:00:02.389 ****** 35374 1726882915.81054: entering _queue_task() for managed_node1/command 35374 1726882915.81216: worker is 1 (out of 1 available) 35374 1726882915.81228: exiting _queue_task() for managed_node1/command 35374 1726882915.81239: done queuing things up, now waiting for results queue to drain 35374 1726882915.81240: waiting for pending results... 35374 1726882915.81379: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 35374 1726882915.81448: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000017a 35374 1726882915.81457: variable 'ansible_search_path' from source: unknown 35374 1726882915.81462: variable 'ansible_search_path' from source: unknown 35374 1726882915.81490: calling self._execute() 35374 1726882915.81537: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.81543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.81551: variable 'omit' from source: magic vars 35374 1726882915.81805: variable 'ansible_distribution' from source: facts 35374 1726882915.81815: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 35374 1726882915.81903: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.81908: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 35374 1726882915.81911: when evaluation is False, skipping this task 35374 1726882915.81914: _execute() done 35374 1726882915.81916: dumping result to json 35374 1726882915.81920: done dumping result, returning 35374 1726882915.81925: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0e448fcc-3ce9-ee6a-9b8c-00000000017a] 35374 1726882915.81930: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000017a 35374 1726882915.82014: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000017a 35374 1726882915.82017: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 35374 1726882915.82061: no more pending results, returning what we 
have 35374 1726882915.82066: results queue empty 35374 1726882915.82067: checking for any_errors_fatal 35374 1726882915.82070: done checking for any_errors_fatal 35374 1726882915.82071: checking for max_fail_percentage 35374 1726882915.82072: done checking for max_fail_percentage 35374 1726882915.82073: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.82074: done checking to see if all hosts have failed 35374 1726882915.82079: getting the remaining hosts for this loop 35374 1726882915.82080: done getting the remaining hosts for this loop 35374 1726882915.82083: getting the next task for host managed_node1 35374 1726882915.82088: done getting next task for host managed_node1 35374 1726882915.82090: ^ task is: TASK: Enable EPEL 8 35374 1726882915.82094: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882915.82097: getting variables 35374 1726882915.82098: in VariableManager get_vars() 35374 1726882915.82115: Calling all_inventory to load vars for managed_node1 35374 1726882915.82116: Calling groups_inventory to load vars for managed_node1 35374 1726882915.82118: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.82124: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.82131: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.82133: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.82363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.82475: done with get_vars() 35374 1726882915.82480: done getting variables 35374 1726882915.82516: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:41:55 -0400 (0:00:00.014) 0:00:02.404 ****** 35374 1726882915.82534: entering _queue_task() for managed_node1/command 35374 1726882915.82688: worker is 1 (out of 1 available) 35374 1726882915.82700: exiting _queue_task() for managed_node1/command 35374 1726882915.82712: done queuing things up, now waiting for results queue to drain 35374 1726882915.82713: waiting for pending results... 
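
'Enable EPEL 7' (enable_epel.yml:32, skipped above) and 'Enable EPEL 8' (:37, queued above and run below) are twin command tasks that differ only in name; note found_in_cache=True this time, since the command action plugin was already loaded for 'Create EPEL 9'. Their command bodies are placeholders; the guards are the logged ones:

    - name: Enable EPEL 7
      command: /bin/true   # placeholder; real command not recoverable from this trace
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']

    - name: Enable EPEL 8
      command: /bin/true   # placeholder
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']
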
35374 1726882915.82847: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 35374 1726882915.82918: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000017b 35374 1726882915.82927: variable 'ansible_search_path' from source: unknown 35374 1726882915.82930: variable 'ansible_search_path' from source: unknown 35374 1726882915.82956: calling self._execute() 35374 1726882915.83011: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.83015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.83023: variable 'omit' from source: magic vars 35374 1726882915.83274: variable 'ansible_distribution' from source: facts 35374 1726882915.83285: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 35374 1726882915.83368: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.83376: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 35374 1726882915.83379: when evaluation is False, skipping this task 35374 1726882915.83381: _execute() done 35374 1726882915.83384: dumping result to json 35374 1726882915.83386: done dumping result, returning 35374 1726882915.83392: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0e448fcc-3ce9-ee6a-9b8c-00000000017b] 35374 1726882915.83397: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000017b 35374 1726882915.83485: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000017b 35374 1726882915.83488: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 35374 1726882915.83532: no more pending results, returning what we have 35374 1726882915.83535: results queue empty 35374 1726882915.83536: checking for any_errors_fatal 35374 1726882915.83539: done checking for any_errors_fatal 35374 1726882915.83540: checking for max_fail_percentage 35374 1726882915.83541: done checking for max_fail_percentage 35374 1726882915.83542: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.83543: done checking to see if all hosts have failed 35374 1726882915.83544: getting the remaining hosts for this loop 35374 1726882915.83545: done getting the remaining hosts for this loop 35374 1726882915.83548: getting the next task for host managed_node1 35374 1726882915.83555: done getting next task for host managed_node1 35374 1726882915.83558: ^ task is: TASK: Enable EPEL 6 35374 1726882915.83561: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882915.83565: getting variables 35374 1726882915.83566: in VariableManager get_vars() 35374 1726882915.83589: Calling all_inventory to load vars for managed_node1 35374 1726882915.83591: Calling groups_inventory to load vars for managed_node1 35374 1726882915.83593: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.83599: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.83600: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.83602: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.83703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.83822: done with get_vars() 35374 1726882915.83829: done getting variables 35374 1726882915.83866: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:41:55 -0400 (0:00:00.013) 0:00:02.417 ****** 35374 1726882915.83885: entering _queue_task() for managed_node1/copy 35374 1726882915.84037: worker is 1 (out of 1 available) 35374 1726882915.84048: exiting _queue_task() for managed_node1/copy 35374 1726882915.84058: done queuing things up, now waiting for results queue to drain 35374 1726882915.84060: waiting for pending results... 35374 1726882915.84195: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 35374 1726882915.84260: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000017d 35374 1726882915.84288: variable 'ansible_search_path' from source: unknown 35374 1726882915.84292: variable 'ansible_search_path' from source: unknown 35374 1726882915.84326: calling self._execute() 35374 1726882915.84377: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.84381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.84389: variable 'omit' from source: magic vars 35374 1726882915.84631: variable 'ansible_distribution' from source: facts 35374 1726882915.84641: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 35374 1726882915.84720: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.84724: Evaluated conditional (ansible_distribution_major_version == '6'): False 35374 1726882915.84726: when evaluation is False, skipping this task 35374 1726882915.84730: _execute() done 35374 1726882915.84733: dumping result to json 35374 1726882915.84735: done dumping result, returning 35374 1726882915.84741: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0e448fcc-3ce9-ee6a-9b8c-00000000017d] 35374 1726882915.84748: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000017d 35374 1726882915.84828: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000017d 35374 1726882915.84831: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 35374 1726882915.84883: no more pending results, returning what we have 35374 
1726882915.84886: results queue empty 35374 1726882915.84887: checking for any_errors_fatal 35374 1726882915.84890: done checking for any_errors_fatal 35374 1726882915.84890: checking for max_fail_percentage 35374 1726882915.84892: done checking for max_fail_percentage 35374 1726882915.84892: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.84893: done checking to see if all hosts have failed 35374 1726882915.84894: getting the remaining hosts for this loop 35374 1726882915.84895: done getting the remaining hosts for this loop 35374 1726882915.84898: getting the next task for host managed_node1 35374 1726882915.84904: done getting next task for host managed_node1 35374 1726882915.84906: ^ task is: TASK: Set network provider to 'nm' 35374 1726882915.84907: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882915.84910: getting variables 35374 1726882915.84911: in VariableManager get_vars() 35374 1726882915.84928: Calling all_inventory to load vars for managed_node1 35374 1726882915.84929: Calling groups_inventory to load vars for managed_node1 35374 1726882915.84931: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.84937: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.84938: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.84946: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.85080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.85189: done with get_vars() 35374 1726882915.85196: done getting variables 35374 1726882915.85230: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13 Friday 20 September 2024 21:41:55 -0400 (0:00:00.013) 0:00:02.431 ****** 35374 1726882915.85245: entering _queue_task() for managed_node1/set_fact 35374 1726882915.85502: worker is 1 (out of 1 available) 35374 1726882915.85515: exiting _queue_task() for managed_node1/set_fact 35374 1726882915.85527: done queuing things up, now waiting for results queue to drain 35374 1726882915.85528: waiting for pending results... 
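
'Enable EPEL 6' above was the last of the EPEL family, a copy task guarded by ansible_distribution_major_version == '6'; with the major version at 9, every enable_epel.yml task seen here skipped. The task queued next, 'Set network provider to 'nm'' (tests_wireless_nm.yml:13), is the first in this stretch that actually executes; from the ok: result printed below it is equivalent to:

    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm

set_fact runs entirely on the controller, which is why the trace below assembles the ssh/sh connection vars yet the handler completes immediately without touching the remote host.
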
35374 1726882915.85762: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 35374 1726882915.85854: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000007 35374 1726882915.85876: variable 'ansible_search_path' from source: unknown 35374 1726882915.85918: calling self._execute() 35374 1726882915.85992: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.86004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.86017: variable 'omit' from source: magic vars 35374 1726882915.86120: variable 'omit' from source: magic vars 35374 1726882915.86154: variable 'omit' from source: magic vars 35374 1726882915.86195: variable 'omit' from source: magic vars 35374 1726882915.86262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 35374 1726882915.86324: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 35374 1726882915.86372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 35374 1726882915.86434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 35374 1726882915.86449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 35374 1726882915.86482: variable 'inventory_hostname' from source: host vars for 'managed_node1' 35374 1726882915.86491: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.86499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.86614: Set connection var ansible_shell_type to sh 35374 1726882915.86620: Set connection var ansible_shell_executable to /bin/sh 35374 1726882915.86629: Set connection var ansible_pipelining to False 35374 1726882915.86653: Set connection var ansible_timeout to 10 35374 1726882915.86669: Set connection var ansible_module_compression to ZIP_DEFLATED 35374 1726882915.86678: Set connection var ansible_connection to ssh 35374 1726882915.86699: variable 'ansible_shell_executable' from source: unknown 35374 1726882915.86703: variable 'ansible_connection' from source: unknown 35374 1726882915.86706: variable 'ansible_module_compression' from source: unknown 35374 1726882915.86708: variable 'ansible_shell_type' from source: unknown 35374 1726882915.86710: variable 'ansible_shell_executable' from source: unknown 35374 1726882915.86713: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.86715: variable 'ansible_pipelining' from source: unknown 35374 1726882915.86717: variable 'ansible_timeout' from source: unknown 35374 1726882915.86719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.86821: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 35374 1726882915.86835: variable 'omit' from source: magic vars 35374 1726882915.86838: starting attempt loop 35374 1726882915.86841: running the handler 35374 1726882915.86851: handler run complete 35374 1726882915.86860: attempt loop complete, returning result 35374 1726882915.86863: _execute() done 35374 1726882915.86865: 
dumping result to json 35374 1726882915.86867: done dumping result, returning 35374 1726882915.86878: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0e448fcc-3ce9-ee6a-9b8c-000000000007] 35374 1726882915.86883: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000007 35374 1726882915.86954: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000007 35374 1726882915.86957: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 35374 1726882915.87028: no more pending results, returning what we have 35374 1726882915.87030: results queue empty 35374 1726882915.87031: checking for any_errors_fatal 35374 1726882915.87036: done checking for any_errors_fatal 35374 1726882915.87036: checking for max_fail_percentage 35374 1726882915.87038: done checking for max_fail_percentage 35374 1726882915.87039: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.87039: done checking to see if all hosts have failed 35374 1726882915.87040: getting the remaining hosts for this loop 35374 1726882915.87041: done getting the remaining hosts for this loop 35374 1726882915.87044: getting the next task for host managed_node1 35374 1726882915.87049: done getting next task for host managed_node1 35374 1726882915.87050: ^ task is: TASK: meta (flush_handlers) 35374 1726882915.87052: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882915.87055: getting variables 35374 1726882915.87056: in VariableManager get_vars() 35374 1726882915.87081: Calling all_inventory to load vars for managed_node1 35374 1726882915.87082: Calling groups_inventory to load vars for managed_node1 35374 1726882915.87084: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.87093: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.87095: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.87096: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.87198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.87312: done with get_vars() 35374 1726882915.87319: done getting variables 35374 1726882915.87359: in VariableManager get_vars() 35374 1726882915.87367: Calling all_inventory to load vars for managed_node1 35374 1726882915.87369: Calling groups_inventory to load vars for managed_node1 35374 1726882915.87371: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.87374: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.87375: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.87377: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.87455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.87583: done with get_vars() 35374 1726882915.87592: done queuing things up, now waiting for results queue to drain 35374 1726882915.87593: results queue empty 35374 1726882915.87594: checking for any_errors_fatal 35374 1726882915.87595: done checking for any_errors_fatal 35374 1726882915.87596: checking for 
max_fail_percentage 35374 1726882915.87596: done checking for max_fail_percentage 35374 1726882915.87597: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.87597: done checking to see if all hosts have failed 35374 1726882915.87597: getting the remaining hosts for this loop 35374 1726882915.87598: done getting the remaining hosts for this loop 35374 1726882915.87599: getting the next task for host managed_node1 35374 1726882915.87602: done getting next task for host managed_node1 35374 1726882915.87602: ^ task is: TASK: meta (flush_handlers) 35374 1726882915.87603: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882915.87608: getting variables 35374 1726882915.87609: in VariableManager get_vars() 35374 1726882915.87615: Calling all_inventory to load vars for managed_node1 35374 1726882915.87616: Calling groups_inventory to load vars for managed_node1 35374 1726882915.87617: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.87620: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.87622: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.87623: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.87704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.87811: done with get_vars() 35374 1726882915.87816: done getting variables 35374 1726882915.87846: in VariableManager get_vars() 35374 1726882915.87852: Calling all_inventory to load vars for managed_node1 35374 1726882915.87853: Calling groups_inventory to load vars for managed_node1 35374 1726882915.87855: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.87858: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.87859: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.87861: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.87940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.88044: done with get_vars() 35374 1726882915.88053: done queuing things up, now waiting for results queue to drain 35374 1726882915.88054: results queue empty 35374 1726882915.88054: checking for any_errors_fatal 35374 1726882915.88055: done checking for any_errors_fatal 35374 1726882915.88055: checking for max_fail_percentage 35374 1726882915.88056: done checking for max_fail_percentage 35374 1726882915.88056: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.88057: done checking to see if all hosts have failed 35374 1726882915.88057: getting the remaining hosts for this loop 35374 1726882915.88058: done getting the remaining hosts for this loop 35374 1726882915.88060: getting the next task for host managed_node1 35374 1726882915.88062: done getting next task for host managed_node1 35374 1726882915.88064: ^ task is: None 35374 1726882915.88065: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 35374 1726882915.88066: done queuing things up, now waiting for results queue to drain 35374 1726882915.88067: results queue empty 35374 1726882915.88067: checking for any_errors_fatal 35374 1726882915.88068: done checking for any_errors_fatal 35374 1726882915.88069: checking for max_fail_percentage 35374 1726882915.88069: done checking for max_fail_percentage 35374 1726882915.88070: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.88071: done checking to see if all hosts have failed 35374 1726882915.88072: getting the next task for host managed_node1 35374 1726882915.88074: done getting next task for host managed_node1 35374 1726882915.88074: ^ task is: None 35374 1726882915.88075: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882915.88108: in VariableManager get_vars() 35374 1726882915.88129: done with get_vars() 35374 1726882915.88133: in VariableManager get_vars() 35374 1726882915.88143: done with get_vars() 35374 1726882915.88146: variable 'omit' from source: magic vars 35374 1726882915.88169: in VariableManager get_vars() 35374 1726882915.88183: done with get_vars() 35374 1726882915.88197: variable 'omit' from source: magic vars PLAY [Play for testing wireless connection] ************************************ 35374 1726882915.88909: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 35374 1726882915.88935: getting the remaining hosts for this loop 35374 1726882915.88936: done getting the remaining hosts for this loop 35374 1726882915.88939: getting the next task for host managed_node1 35374 1726882915.88941: done getting next task for host managed_node1 35374 1726882915.88943: ^ task is: TASK: Gathering Facts 35374 1726882915.88945: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882915.88946: getting variables 35374 1726882915.88947: in VariableManager get_vars() 35374 1726882915.88965: Calling all_inventory to load vars for managed_node1 35374 1726882915.88967: Calling groups_inventory to load vars for managed_node1 35374 1726882915.88969: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.88974: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.88986: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.88989: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.89135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.89393: done with get_vars() 35374 1726882915.89401: done getting variables 35374 1726882915.89435: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:3 Friday 20 September 2024 21:41:55 -0400 (0:00:00.042) 0:00:02.473 ****** 35374 1726882915.89456: entering _queue_task() for managed_node1/gather_facts 35374 1726882915.89675: worker is 1 (out of 1 available) 35374 1726882915.89687: exiting _queue_task() for managed_node1/gather_facts 35374 1726882915.89699: done queuing things up, now waiting for results queue to drain 35374 1726882915.89700: waiting for pending results... 
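
The two meta (flush_handlers) entries above close out the first play, and the strategy then starts PLAY [Play for testing wireless connection]. Notably, even its 'Gathering Facts' task (tests_wireless.yml:3) is guarded and, per the evaluations just below, skips on this host, so no fresh facts are collected for the wireless play. In effect it behaves like the sketch below; how the guard is actually attached in the real playbook is an assumption:

    - name: Gathering Facts
      gather_facts: {}
      when:
        - ansible_distribution_major_version != '6'   # evaluated True
        - ansible_distribution_major_version == '7'   # evaluated False on EL9, so the task skips
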
35374 1726882915.90601: running TaskExecutor() for managed_node1/TASK: Gathering Facts 35374 1726882915.90705: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000001a3 35374 1726882915.90724: variable 'ansible_search_path' from source: unknown 35374 1726882915.90769: calling self._execute() 35374 1726882915.90856: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.90873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.90886: variable 'omit' from source: magic vars 35374 1726882915.91266: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.91286: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882915.91407: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.91418: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882915.91425: when evaluation is False, skipping this task 35374 1726882915.91432: _execute() done 35374 1726882915.91437: dumping result to json 35374 1726882915.91443: done dumping result, returning 35374 1726882915.91457: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-ee6a-9b8c-0000000001a3] 35374 1726882915.91470: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000001a3 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882915.91606: no more pending results, returning what we have 35374 1726882915.91611: results queue empty 35374 1726882915.91612: checking for any_errors_fatal 35374 1726882915.91613: done checking for any_errors_fatal 35374 1726882915.91614: checking for max_fail_percentage 35374 1726882915.91616: done checking for max_fail_percentage 35374 1726882915.91617: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.91618: done checking to see if all hosts have failed 35374 1726882915.91619: getting the remaining hosts for this loop 35374 1726882915.91621: done getting the remaining hosts for this loop 35374 1726882915.91625: getting the next task for host managed_node1 35374 1726882915.91633: done getting next task for host managed_node1 35374 1726882915.91634: ^ task is: TASK: meta (flush_handlers) 35374 1726882915.91637: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882915.91640: getting variables 35374 1726882915.91642: in VariableManager get_vars() 35374 1726882915.91695: Calling all_inventory to load vars for managed_node1 35374 1726882915.91698: Calling groups_inventory to load vars for managed_node1 35374 1726882915.91700: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.91712: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.91715: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.91718: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.91900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.92100: done with get_vars() 35374 1726882915.92110: done getting variables 35374 1726882915.92192: in VariableManager get_vars() 35374 1726882915.92209: Calling all_inventory to load vars for managed_node1 35374 1726882915.92211: Calling groups_inventory to load vars for managed_node1 35374 1726882915.92213: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.92217: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.92219: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.92222: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.92581: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000001a3 35374 1726882915.92584: WORKER PROCESS EXITING 35374 1726882915.92601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.92848: done with get_vars() 35374 1726882915.92859: done queuing things up, now waiting for results queue to drain 35374 1726882915.92861: results queue empty 35374 1726882915.92862: checking for any_errors_fatal 35374 1726882915.92866: done checking for any_errors_fatal 35374 1726882915.92867: checking for max_fail_percentage 35374 1726882915.92868: done checking for max_fail_percentage 35374 1726882915.92869: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.92870: done checking to see if all hosts have failed 35374 1726882915.92870: getting the remaining hosts for this loop 35374 1726882915.92871: done getting the remaining hosts for this loop 35374 1726882915.92873: getting the next task for host managed_node1 35374 1726882915.92877: done getting next task for host managed_node1 35374 1726882915.92879: ^ task is: TASK: INIT: wireless tests 35374 1726882915.92880: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882915.92882: getting variables 35374 1726882915.92883: in VariableManager get_vars() 35374 1726882915.92898: Calling all_inventory to load vars for managed_node1 35374 1726882915.92900: Calling groups_inventory to load vars for managed_node1 35374 1726882915.92902: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.92906: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.92909: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.92911: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.93044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.93237: done with get_vars() 35374 1726882915.93244: done getting variables 35374 1726882915.93316: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [INIT: wireless tests] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:8 Friday 20 September 2024 21:41:55 -0400 (0:00:00.038) 0:00:02.512 ****** 35374 1726882915.93339: entering _queue_task() for managed_node1/debug 35374 1726882915.93341: Creating lock for debug 35374 1726882915.94270: worker is 1 (out of 1 available) 35374 1726882915.94286: exiting _queue_task() for managed_node1/debug 35374 1726882915.94297: done queuing things up, now waiting for results queue to drain 35374 1726882915.94298: waiting for pending results... 
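
'INIT: wireless tests' (tests_wireless.yml:8) is the first debug task of the run, hence the "Creating lock for debug" entry above; its skip result below shows only the false_condition, with no changed flag. A sketch with the message body as a placeholder:

    - name: "INIT: wireless tests"   # quoted because the name contains a colon
      debug:
        msg: placeholder             # the real message is not recoverable from this trace
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version == '7'
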
35374 1726882915.94521: running TaskExecutor() for managed_node1/TASK: INIT: wireless tests 35374 1726882915.94613: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000000b 35374 1726882915.94634: variable 'ansible_search_path' from source: unknown 35374 1726882915.94687: calling self._execute() 35374 1726882915.94765: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.94777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.94789: variable 'omit' from source: magic vars 35374 1726882915.95136: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.95156: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882915.95276: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.95291: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882915.95299: when evaluation is False, skipping this task 35374 1726882915.95306: _execute() done 35374 1726882915.95312: dumping result to json 35374 1726882915.95319: done dumping result, returning 35374 1726882915.95327: done running TaskExecutor() for managed_node1/TASK: INIT: wireless tests [0e448fcc-3ce9-ee6a-9b8c-00000000000b] 35374 1726882915.95337: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000000b skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882915.95469: no more pending results, returning what we have 35374 1726882915.95473: results queue empty 35374 1726882915.95474: checking for any_errors_fatal 35374 1726882915.95477: done checking for any_errors_fatal 35374 1726882915.95478: checking for max_fail_percentage 35374 1726882915.95480: done checking for max_fail_percentage 35374 1726882915.95481: checking to see if all hosts have failed and the running result is not ok 35374 1726882915.95482: done checking to see if all hosts have failed 35374 1726882915.95483: getting the remaining hosts for this loop 35374 1726882915.95484: done getting the remaining hosts for this loop 35374 1726882915.95487: getting the next task for host managed_node1 35374 1726882915.95495: done getting next task for host managed_node1 35374 1726882915.95498: ^ task is: TASK: Include the task 'setup_mock_wifi.yml' 35374 1726882915.95501: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882915.95504: getting variables 35374 1726882915.95506: in VariableManager get_vars() 35374 1726882915.95553: Calling all_inventory to load vars for managed_node1 35374 1726882915.95556: Calling groups_inventory to load vars for managed_node1 35374 1726882915.95559: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.95571: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.95574: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.95578: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.95755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.95960: done with get_vars() 35374 1726882915.95970: done getting variables TASK [Include the task 'setup_mock_wifi.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:11 Friday 20 September 2024 21:41:55 -0400 (0:00:00.029) 0:00:02.541 ****** 35374 1726882915.96267: entering _queue_task() for managed_node1/include_tasks 35374 1726882915.96278: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000000b 35374 1726882915.96281: WORKER PROCESS EXITING 35374 1726882915.96474: worker is 1 (out of 1 available) 35374 1726882915.96484: exiting _queue_task() for managed_node1/include_tasks 35374 1726882915.96494: done queuing things up, now waiting for results queue to drain 35374 1726882915.96496: waiting for pending results... 35374 1726882915.96709: running TaskExecutor() for managed_node1/TASK: Include the task 'setup_mock_wifi.yml' 35374 1726882915.96796: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000000c 35374 1726882915.96814: variable 'ansible_search_path' from source: unknown 35374 1726882915.96855: calling self._execute() 35374 1726882915.96930: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.96942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.96954: variable 'omit' from source: magic vars 35374 1726882915.97364: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.97384: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882915.97499: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.97509: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882915.97516: when evaluation is False, skipping this task 35374 1726882915.97522: _execute() done 35374 1726882915.97528: dumping result to json 35374 1726882915.97534: done dumping result, returning 35374 1726882915.97543: done running TaskExecutor() for managed_node1/TASK: Include the task 'setup_mock_wifi.yml' [0e448fcc-3ce9-ee6a-9b8c-00000000000c] 35374 1726882915.97552: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000000c skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882915.97680: no more pending results, returning what we have 35374 1726882915.97684: results queue empty 35374 1726882915.97685: checking for any_errors_fatal 35374 1726882915.97691: done checking for any_errors_fatal 35374 1726882915.97692: checking for max_fail_percentage 35374 1726882915.97694: done checking for max_fail_percentage 35374 1726882915.97695: checking to see if all hosts have 
failed and the running result is not ok 35374 1726882915.97696: done checking to see if all hosts have failed 35374 1726882915.97696: getting the remaining hosts for this loop 35374 1726882915.97698: done getting the remaining hosts for this loop 35374 1726882915.97702: getting the next task for host managed_node1 35374 1726882915.97708: done getting next task for host managed_node1 35374 1726882915.97711: ^ task is: TASK: Copy client certs 35374 1726882915.97713: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882915.97716: getting variables 35374 1726882915.97717: in VariableManager get_vars() 35374 1726882915.97761: Calling all_inventory to load vars for managed_node1 35374 1726882915.97766: Calling groups_inventory to load vars for managed_node1 35374 1726882915.97769: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882915.97780: Calling all_plugins_play to load vars for managed_node1 35374 1726882915.97783: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882915.97786: Calling groups_plugins_play to load vars for managed_node1 35374 1726882915.98009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882915.98198: done with get_vars() 35374 1726882915.98207: done getting variables 35374 1726882915.98424: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000000c 35374 1726882915.98427: WORKER PROCESS EXITING 35374 1726882915.98438: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Copy client certs] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:13 Friday 20 September 2024 21:41:55 -0400 (0:00:00.021) 0:00:02.563 ****** 35374 1726882915.98459: entering _queue_task() for managed_node1/copy 35374 1726882915.98649: worker is 1 (out of 1 available) 35374 1726882915.98660: exiting _queue_task() for managed_node1/copy 35374 1726882915.98673: done queuing things up, now waiting for results queue to drain 35374 1726882915.98674: waiting for pending results... 
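The "Copy client certs" task dispatched next is a looped copy action: the log loads the 'items' lookup (i.e. with_items) and then skips each of client.key, client.pem and cacert.pem because the same false_condition fires per item. A minimal sketch of the shape this implies; the item list and the when guard are taken from the log, while src/dest paths and the file mode are assumptions:

- name: Copy client certs
  copy:
    src: "{{ item }}"                  # items supplied by the loop below
    dest: "/etc/pki/tls/{{ item }}"    # destination path is an assumption, not shown in the log
    mode: "0600"                       # mode is an assumption
  with_items:                          # matches the LookupModule 'items' load in the log
    - client.key
    - client.pem
    - cacert.pem
  when: ansible_distribution_major_version == '7'   # evaluates False here, so every item is skipped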
35374 1726882915.98911: running TaskExecutor() for managed_node1/TASK: Copy client certs 35374 1726882915.99028: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000000d 35374 1726882915.99047: variable 'ansible_search_path' from source: unknown 35374 1726882915.99280: Loaded config def from plugin (lookup/items) 35374 1726882915.99293: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 35374 1726882915.99361: variable 'omit' from source: magic vars 35374 1726882915.99484: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882915.99499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882915.99513: variable 'omit' from source: magic vars 35374 1726882915.99961: variable 'ansible_distribution_major_version' from source: facts 35374 1726882915.99980: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.00101: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.00112: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.00120: when evaluation is False, skipping this task 35374 1726882916.00148: variable 'item' from source: unknown 35374 1726882916.00225: variable 'item' from source: unknown skipping: [managed_node1] => (item=client.key) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "client.key", "skip_reason": "Conditional result was False" } 35374 1726882916.00421: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.00434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.00448: variable 'omit' from source: magic vars 35374 1726882916.00604: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.00612: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.00709: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.00720: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.00728: when evaluation is False, skipping this task 35374 1726882916.00758: variable 'item' from source: unknown 35374 1726882916.00829: variable 'item' from source: unknown skipping: [managed_node1] => (item=client.pem) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "client.pem", "skip_reason": "Conditional result was False" } 35374 1726882916.00958: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.00973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.00986: variable 'omit' from source: magic vars 35374 1726882916.01137: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.01148: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.01257: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.01268: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.01275: when evaluation is False, skipping this task 35374 1726882916.01301: variable 'item' from source: unknown 35374 1726882916.01368: variable 'item' from source: unknown skipping: [managed_node1] => (item=cacert.pem) => { "ansible_loop_var": "item", "changed": false, "false_condition": 
"ansible_distribution_major_version == '7'", "item": "cacert.pem", "skip_reason": "Conditional result was False" } 35374 1726882916.01451: dumping result to json 35374 1726882916.01466: done dumping result, returning 35374 1726882916.01476: done running TaskExecutor() for managed_node1/TASK: Copy client certs [0e448fcc-3ce9-ee6a-9b8c-00000000000d] 35374 1726882916.01486: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000000d 35374 1726882916.01555: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000000d 35374 1726882916.01563: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false } MSG: All items skipped 35374 1726882916.01603: no more pending results, returning what we have 35374 1726882916.01607: results queue empty 35374 1726882916.01608: checking for any_errors_fatal 35374 1726882916.01613: done checking for any_errors_fatal 35374 1726882916.01614: checking for max_fail_percentage 35374 1726882916.01615: done checking for max_fail_percentage 35374 1726882916.01616: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.01617: done checking to see if all hosts have failed 35374 1726882916.01618: getting the remaining hosts for this loop 35374 1726882916.01619: done getting the remaining hosts for this loop 35374 1726882916.01623: getting the next task for host managed_node1 35374 1726882916.01631: done getting next task for host managed_node1 35374 1726882916.01633: ^ task is: TASK: TEST: wireless connection with WPA-PSK 35374 1726882916.01636: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.01638: getting variables 35374 1726882916.01640: in VariableManager get_vars() 35374 1726882916.01687: Calling all_inventory to load vars for managed_node1 35374 1726882916.01690: Calling groups_inventory to load vars for managed_node1 35374 1726882916.01693: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.01704: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.01706: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.01709: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.01914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.02112: done with get_vars() 35374 1726882916.02123: done getting variables 35374 1726882916.02191: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: wireless connection with WPA-PSK] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:24 Friday 20 September 2024 21:41:56 -0400 (0:00:00.037) 0:00:02.600 ****** 35374 1726882916.02221: entering _queue_task() for managed_node1/debug 35374 1726882916.02623: worker is 1 (out of 1 available) 35374 1726882916.02632: exiting _queue_task() for managed_node1/debug 35374 1726882916.02642: done queuing things up, now waiting for results queue to drain 35374 1726882916.02643: waiting for pending results... 35374 1726882916.02865: running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with WPA-PSK 35374 1726882916.02951: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000000f 35374 1726882916.02972: variable 'ansible_search_path' from source: unknown 35374 1726882916.03011: calling self._execute() 35374 1726882916.03084: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.03097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.03109: variable 'omit' from source: magic vars 35374 1726882916.03445: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.03461: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.03581: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.03591: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.03598: when evaluation is False, skipping this task 35374 1726882916.03604: _execute() done 35374 1726882916.03611: dumping result to json 35374 1726882916.03617: done dumping result, returning 35374 1726882916.03627: done running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with WPA-PSK [0e448fcc-3ce9-ee6a-9b8c-00000000000f] 35374 1726882916.03639: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000000f 35374 1726882916.03730: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000000f 35374 1726882916.03738: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882916.03783: no more pending results, returning what we have 35374 1726882916.03787: results queue empty 35374 1726882916.03788: checking for 
any_errors_fatal 35374 1726882916.03797: done checking for any_errors_fatal 35374 1726882916.03798: checking for max_fail_percentage 35374 1726882916.03800: done checking for max_fail_percentage 35374 1726882916.03801: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.03802: done checking to see if all hosts have failed 35374 1726882916.03802: getting the remaining hosts for this loop 35374 1726882916.03804: done getting the remaining hosts for this loop 35374 1726882916.03807: getting the next task for host managed_node1 35374 1726882916.03815: done getting next task for host managed_node1 35374 1726882916.03820: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 35374 1726882916.03823: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882916.03839: getting variables 35374 1726882916.03841: in VariableManager get_vars() 35374 1726882916.03892: Calling all_inventory to load vars for managed_node1 35374 1726882916.03895: Calling groups_inventory to load vars for managed_node1 35374 1726882916.03898: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.03909: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.03912: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.03915: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.04089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.04297: done with get_vars() 35374 1726882916.04306: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:56 -0400 (0:00:00.023) 0:00:02.624 ****** 35374 1726882916.04579: entering _queue_task() for managed_node1/include_tasks 35374 1726882916.04783: worker is 1 (out of 1 available) 35374 1726882916.04794: exiting _queue_task() for managed_node1/include_tasks 35374 1726882916.04805: done queuing things up, now waiting for results queue to drain 35374 1726882916.04806: waiting for pending results... 
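Note the two-step gate visible for every task in this run: ansible_distribution_major_version != '6' evaluates True, then ansible_distribution_major_version == '7' evaluates False, and the task is skipped. Stacked when entries (or conditions inherited from an enclosing block or play) produce exactly this ordered evaluation; a sketch of the pattern using the next task, with the included file name as a placeholder since the log only shows the task name:

- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml                      # placeholder file name, not shown in the log
  when:
    - ansible_distribution_major_version != '6'     # evaluated first  -> True
    - ansible_distribution_major_version == '7'     # evaluated second -> False, so the task is skipped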
35374 1726882916.05028: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 35374 1726882916.05151: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000017 35374 1726882916.05216: variable 'ansible_search_path' from source: unknown 35374 1726882916.05224: variable 'ansible_search_path' from source: unknown 35374 1726882916.05267: calling self._execute() 35374 1726882916.05338: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.05350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.05370: variable 'omit' from source: magic vars 35374 1726882916.05716: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.05734: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.05853: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.05867: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.05876: when evaluation is False, skipping this task 35374 1726882916.05885: _execute() done 35374 1726882916.05891: dumping result to json 35374 1726882916.05900: done dumping result, returning 35374 1726882916.05914: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-ee6a-9b8c-000000000017] 35374 1726882916.05925: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000017 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.06056: no more pending results, returning what we have 35374 1726882916.06061: results queue empty 35374 1726882916.06062: checking for any_errors_fatal 35374 1726882916.06067: done checking for any_errors_fatal 35374 1726882916.06068: checking for max_fail_percentage 35374 1726882916.06070: done checking for max_fail_percentage 35374 1726882916.06071: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.06072: done checking to see if all hosts have failed 35374 1726882916.06072: getting the remaining hosts for this loop 35374 1726882916.06074: done getting the remaining hosts for this loop 35374 1726882916.06078: getting the next task for host managed_node1 35374 1726882916.06085: done getting next task for host managed_node1 35374 1726882916.06090: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 35374 1726882916.06093: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.06107: getting variables 35374 1726882916.06110: in VariableManager get_vars() 35374 1726882916.06154: Calling all_inventory to load vars for managed_node1 35374 1726882916.06158: Calling groups_inventory to load vars for managed_node1 35374 1726882916.06160: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.06173: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.06176: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.06179: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.06406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.06611: done with get_vars() 35374 1726882916.06620: done getting variables 35374 1726882916.06834: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000017 35374 1726882916.06837: WORKER PROCESS EXITING 35374 1726882916.06862: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:41:56 -0400 (0:00:00.023) 0:00:02.647 ****** 35374 1726882916.06893: entering _queue_task() for managed_node1/debug 35374 1726882916.07087: worker is 1 (out of 1 available) 35374 1726882916.07100: exiting _queue_task() for managed_node1/debug 35374 1726882916.07111: done queuing things up, now waiting for results queue to drain 35374 1726882916.07112: waiting for pending results... 
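The next task, "Print network provider", loads the debug action module. A plausible one-liner consistent with the task name; the message text and the network_provider variable are assumptions based on the role's naming, not shown in this log:

- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"   # variable name assumed from the role's conventions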
35374 1726882916.07324: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 35374 1726882916.07447: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000018 35374 1726882916.07470: variable 'ansible_search_path' from source: unknown 35374 1726882916.07480: variable 'ansible_search_path' from source: unknown 35374 1726882916.07517: calling self._execute() 35374 1726882916.07591: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.07602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.07614: variable 'omit' from source: magic vars 35374 1726882916.07952: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.07971: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.08084: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.08099: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.08106: when evaluation is False, skipping this task 35374 1726882916.08112: _execute() done 35374 1726882916.08118: dumping result to json 35374 1726882916.08125: done dumping result, returning 35374 1726882916.08134: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-ee6a-9b8c-000000000018] 35374 1726882916.08144: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000018 skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882916.08273: no more pending results, returning what we have 35374 1726882916.08277: results queue empty 35374 1726882916.08278: checking for any_errors_fatal 35374 1726882916.08283: done checking for any_errors_fatal 35374 1726882916.08284: checking for max_fail_percentage 35374 1726882916.08286: done checking for max_fail_percentage 35374 1726882916.08287: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.08287: done checking to see if all hosts have failed 35374 1726882916.08288: getting the remaining hosts for this loop 35374 1726882916.08289: done getting the remaining hosts for this loop 35374 1726882916.08293: getting the next task for host managed_node1 35374 1726882916.08300: done getting next task for host managed_node1 35374 1726882916.08304: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 35374 1726882916.08307: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.08320: getting variables 35374 1726882916.08322: in VariableManager get_vars() 35374 1726882916.08367: Calling all_inventory to load vars for managed_node1 35374 1726882916.08369: Calling groups_inventory to load vars for managed_node1 35374 1726882916.08372: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.08382: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.08384: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.08387: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.08559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.08767: done with get_vars() 35374 1726882916.08776: done getting variables 35374 1726882916.09047: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 35374 1726882916.09070: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000018 35374 1726882916.09073: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:56 -0400 (0:00:00.021) 0:00:02.669 ****** 35374 1726882916.09086: entering _queue_task() for managed_node1/fail 35374 1726882916.09088: Creating lock for fail 35374 1726882916.09288: worker is 1 (out of 1 available) 35374 1726882916.09300: exiting _queue_task() for managed_node1/fail 35374 1726882916.09310: done queuing things up, now waiting for results queue to drain 35374 1726882916.09311: waiting for pending results... 
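The "Abort applying the network state configuration ..." task loads the fail action for the first time (found_in_cache=False, with a lock created for fail). A sketch of what such a guard typically looks like; only the task name and the fail action come from the log, the condition and message are assumptions:

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  fail:
    msg: Applying network_state is not supported with the initscripts provider.   # message is an assumption
  when:
    - network_state is defined and network_state | length > 0   # assumption
    - network_provider == "initscripts"                         # assumption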
35374 1726882916.09529: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 35374 1726882916.09651: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000019 35374 1726882916.09672: variable 'ansible_search_path' from source: unknown 35374 1726882916.09679: variable 'ansible_search_path' from source: unknown 35374 1726882916.09714: calling self._execute() 35374 1726882916.09785: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.09796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.09808: variable 'omit' from source: magic vars 35374 1726882916.10137: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.10154: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.10272: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.10284: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.10292: when evaluation is False, skipping this task 35374 1726882916.10300: _execute() done 35374 1726882916.10306: dumping result to json 35374 1726882916.10313: done dumping result, returning 35374 1726882916.10323: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-ee6a-9b8c-000000000019] 35374 1726882916.10333: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000019 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.10459: no more pending results, returning what we have 35374 1726882916.10465: results queue empty 35374 1726882916.10466: checking for any_errors_fatal 35374 1726882916.10471: done checking for any_errors_fatal 35374 1726882916.10472: checking for max_fail_percentage 35374 1726882916.10475: done checking for max_fail_percentage 35374 1726882916.10476: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.10477: done checking to see if all hosts have failed 35374 1726882916.10477: getting the remaining hosts for this loop 35374 1726882916.10479: done getting the remaining hosts for this loop 35374 1726882916.10482: getting the next task for host managed_node1 35374 1726882916.10489: done getting next task for host managed_node1 35374 1726882916.10493: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 35374 1726882916.10496: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.10509: getting variables 35374 1726882916.10511: in VariableManager get_vars() 35374 1726882916.10555: Calling all_inventory to load vars for managed_node1 35374 1726882916.10558: Calling groups_inventory to load vars for managed_node1 35374 1726882916.10560: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.10572: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.10575: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.10578: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.10793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.11000: done with get_vars() 35374 1726882916.11009: done getting variables 35374 1726882916.11561: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000019 35374 1726882916.11566: WORKER PROCESS EXITING 35374 1726882916.11578: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:41:56 -0400 (0:00:00.025) 0:00:02.694 ****** 35374 1726882916.11604: entering _queue_task() for managed_node1/fail 35374 1726882916.11799: worker is 1 (out of 1 available) 35374 1726882916.11813: exiting _queue_task() for managed_node1/fail 35374 1726882916.11825: done queuing things up, now waiting for results queue to drain 35374 1726882916.11827: waiting for pending results... 
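The same fail-action pattern follows, now keyed on the managed host's major version. A hedged sketch; the threshold comes from the task name, the network_state condition and message are assumptions:

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  fail:
    msg: Applying network_state requires EL8 or later.            # message is an assumption
  when:
    - network_state is defined and network_state | length > 0     # assumption
    - ansible_distribution_major_version | int < 8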
35374 1726882916.12386: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 35374 1726882916.12952: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000001a 35374 1726882916.12995: variable 'ansible_search_path' from source: unknown 35374 1726882916.13006: variable 'ansible_search_path' from source: unknown 35374 1726882916.13058: calling self._execute() 35374 1726882916.13140: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.13151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.13171: variable 'omit' from source: magic vars 35374 1726882916.13668: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.13690: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.13813: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.13823: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.13831: when evaluation is False, skipping this task 35374 1726882916.13838: _execute() done 35374 1726882916.13844: dumping result to json 35374 1726882916.13851: done dumping result, returning 35374 1726882916.13861: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-ee6a-9b8c-00000000001a] 35374 1726882916.13875: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000001a 35374 1726882916.13988: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000001a 35374 1726882916.13996: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.14043: no more pending results, returning what we have 35374 1726882916.14048: results queue empty 35374 1726882916.14049: checking for any_errors_fatal 35374 1726882916.14055: done checking for any_errors_fatal 35374 1726882916.14056: checking for max_fail_percentage 35374 1726882916.14058: done checking for max_fail_percentage 35374 1726882916.14059: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.14060: done checking to see if all hosts have failed 35374 1726882916.14061: getting the remaining hosts for this loop 35374 1726882916.14062: done getting the remaining hosts for this loop 35374 1726882916.14068: getting the next task for host managed_node1 35374 1726882916.14075: done getting next task for host managed_node1 35374 1726882916.14079: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 35374 1726882916.14082: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.14097: getting variables 35374 1726882916.14099: in VariableManager get_vars() 35374 1726882916.14148: Calling all_inventory to load vars for managed_node1 35374 1726882916.14151: Calling groups_inventory to load vars for managed_node1 35374 1726882916.14154: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.14167: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.14171: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.14174: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.14340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.14537: done with get_vars() 35374 1726882916.14547: done getting variables 35374 1726882916.14620: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:41:56 -0400 (0:00:00.030) 0:00:02.725 ****** 35374 1726882916.14656: entering _queue_task() for managed_node1/fail 35374 1726882916.15278: worker is 1 (out of 1 available) 35374 1726882916.15290: exiting _queue_task() for managed_node1/fail 35374 1726882916.15302: done queuing things up, now waiting for results queue to drain 35374 1726882916.15303: waiting for pending results... 
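Next comes the EL10 teaming guard, again a fail task. One plausible shape; how the role actually detects team interfaces in network_connections is a guess:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Teaming is not supported on EL10 or later.   # message is an assumption
  when:
    - ansible_distribution_major_version | int >= 10
    - "network_connections | selectattr('type', 'defined') | selectattr('type', 'equalto', 'team') | list | length > 0"   # detection logic is a guess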
35374 1726882916.17015: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 35374 1726882916.17251: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000001b 35374 1726882916.17275: variable 'ansible_search_path' from source: unknown 35374 1726882916.17283: variable 'ansible_search_path' from source: unknown 35374 1726882916.17326: calling self._execute() 35374 1726882916.17498: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.17538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.17551: variable 'omit' from source: magic vars 35374 1726882916.18249: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.18413: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.18536: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.18622: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.18631: when evaluation is False, skipping this task 35374 1726882916.18639: _execute() done 35374 1726882916.18648: dumping result to json 35374 1726882916.18656: done dumping result, returning 35374 1726882916.18671: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-ee6a-9b8c-00000000001b] 35374 1726882916.18685: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000001b skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.18869: no more pending results, returning what we have 35374 1726882916.18873: results queue empty 35374 1726882916.18874: checking for any_errors_fatal 35374 1726882916.18882: done checking for any_errors_fatal 35374 1726882916.18883: checking for max_fail_percentage 35374 1726882916.18885: done checking for max_fail_percentage 35374 1726882916.18886: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.18887: done checking to see if all hosts have failed 35374 1726882916.18888: getting the remaining hosts for this loop 35374 1726882916.18889: done getting the remaining hosts for this loop 35374 1726882916.18893: getting the next task for host managed_node1 35374 1726882916.18901: done getting next task for host managed_node1 35374 1726882916.18905: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 35374 1726882916.18908: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.18927: getting variables 35374 1726882916.18929: in VariableManager get_vars() 35374 1726882916.18984: Calling all_inventory to load vars for managed_node1 35374 1726882916.18987: Calling groups_inventory to load vars for managed_node1 35374 1726882916.18990: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.19001: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.19004: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.19007: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.19229: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000001b 35374 1726882916.19232: WORKER PROCESS EXITING 35374 1726882916.19277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.19481: done with get_vars() 35374 1726882916.19491: done getting variables 35374 1726882916.19783: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:56 -0400 (0:00:00.051) 0:00:02.776 ****** 35374 1726882916.19812: entering _queue_task() for managed_node1/dnf 35374 1726882916.20041: worker is 1 (out of 1 available) 35374 1726882916.20054: exiting _queue_task() for managed_node1/dnf 35374 1726882916.20269: done queuing things up, now waiting for results queue to drain 35374 1726882916.20271: waiting for pending results... 
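The DNF check task that follows loads the dnf action module for the first time (found_in_cache=False). The log does not reveal the module arguments; one plausible, non-destructive shape uses check_mode so update availability is only reported, never applied:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"   # hypothetical variable; the real package list is not visible here
    state: latest
  check_mode: true                   # report-only: a would-change result means updates are available
  register: __network_updates        # hypothetical register name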
35374 1726882916.20920: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 35374 1726882916.21153: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000001c 35374 1726882916.21179: variable 'ansible_search_path' from source: unknown 35374 1726882916.21188: variable 'ansible_search_path' from source: unknown 35374 1726882916.21225: calling self._execute() 35374 1726882916.21413: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.21424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.21436: variable 'omit' from source: magic vars 35374 1726882916.22026: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.22047: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.22177: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.22188: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.22195: when evaluation is False, skipping this task 35374 1726882916.22204: _execute() done 35374 1726882916.22215: dumping result to json 35374 1726882916.22223: done dumping result, returning 35374 1726882916.22234: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-00000000001c] 35374 1726882916.22246: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000001c skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.22404: no more pending results, returning what we have 35374 1726882916.22408: results queue empty 35374 1726882916.22409: checking for any_errors_fatal 35374 1726882916.22414: done checking for any_errors_fatal 35374 1726882916.22414: checking for max_fail_percentage 35374 1726882916.22416: done checking for max_fail_percentage 35374 1726882916.22417: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.22418: done checking to see if all hosts have failed 35374 1726882916.22418: getting the remaining hosts for this loop 35374 1726882916.22420: done getting the remaining hosts for this loop 35374 1726882916.22423: getting the next task for host managed_node1 35374 1726882916.22429: done getting next task for host managed_node1 35374 1726882916.22432: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 35374 1726882916.22435: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.22452: getting variables 35374 1726882916.22454: in VariableManager get_vars() 35374 1726882916.22501: Calling all_inventory to load vars for managed_node1 35374 1726882916.22503: Calling groups_inventory to load vars for managed_node1 35374 1726882916.22506: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.22517: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.22520: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.22523: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.22669: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000001c 35374 1726882916.22672: WORKER PROCESS EXITING 35374 1726882916.22695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.22937: done with get_vars() 35374 1726882916.22948: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 35374 1726882916.23031: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:41:56 -0400 (0:00:00.032) 0:00:02.809 ****** 35374 1726882916.23457: entering _queue_task() for managed_node1/yum 35374 1726882916.23459: Creating lock for yum 35374 1726882916.23735: worker is 1 (out of 1 available) 35374 1726882916.23749: exiting _queue_task() for managed_node1/yum 35374 1726882916.23770: done queuing things up, now waiting for results queue to drain 35374 1726882916.23771: waiting for pending results... 
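Worth noting in the block below: "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf". On dnf-based platforms ansible-core resolves the legacy yum action to the dnf implementation, so a task written against the old module name still runs; a sketch under the same assumptions as the DNF check above:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  yum:                               # resolved to ansible.builtin.dnf on this host, per the log
    name: "{{ network_packages }}"   # hypothetical variable
    state: latest
  check_mode: true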
35374 1726882916.24971: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 35374 1726882916.25094: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000001d 35374 1726882916.25239: variable 'ansible_search_path' from source: unknown 35374 1726882916.25248: variable 'ansible_search_path' from source: unknown 35374 1726882916.25290: calling self._execute() 35374 1726882916.25475: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.25487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.25557: variable 'omit' from source: magic vars 35374 1726882916.26225: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.26320: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.26484: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.26636: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.26644: when evaluation is False, skipping this task 35374 1726882916.26650: _execute() done 35374 1726882916.26657: dumping result to json 35374 1726882916.26665: done dumping result, returning 35374 1726882916.26678: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-00000000001d] 35374 1726882916.26689: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000001d skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.26836: no more pending results, returning what we have 35374 1726882916.26840: results queue empty 35374 1726882916.26841: checking for any_errors_fatal 35374 1726882916.26847: done checking for any_errors_fatal 35374 1726882916.26848: checking for max_fail_percentage 35374 1726882916.26850: done checking for max_fail_percentage 35374 1726882916.26850: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.26851: done checking to see if all hosts have failed 35374 1726882916.26852: getting the remaining hosts for this loop 35374 1726882916.26854: done getting the remaining hosts for this loop 35374 1726882916.26858: getting the next task for host managed_node1 35374 1726882916.26867: done getting next task for host managed_node1 35374 1726882916.26871: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 35374 1726882916.26874: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.26888: getting variables 35374 1726882916.26890: in VariableManager get_vars() 35374 1726882916.26940: Calling all_inventory to load vars for managed_node1 35374 1726882916.26943: Calling groups_inventory to load vars for managed_node1 35374 1726882916.26946: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.26957: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.26960: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.26964: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.27197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.27412: done with get_vars() 35374 1726882916.27423: done getting variables 35374 1726882916.27778: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000001d 35374 1726882916.27781: WORKER PROCESS EXITING 35374 1726882916.27807: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:41:56 -0400 (0:00:00.047) 0:00:02.857 ****** 35374 1726882916.27840: entering _queue_task() for managed_node1/fail 35374 1726882916.28289: worker is 1 (out of 1 available) 35374 1726882916.28300: exiting _queue_task() for managed_node1/fail 35374 1726882916.28312: done queuing things up, now waiting for results queue to drain 35374 1726882916.28313: waiting for pending results... 
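The "Ask user's consent to restart NetworkManager ..." task is another fail action, i.e. the role aborts unless the user has opted in to a restart. A sketch under that reading; the consent variable name is a guess, not shown in the log:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    msg: >-
      Managing wireless or team interfaces requires restarting NetworkManager;
      set the role's restart-consent variable to proceed.
  when: not (network_allow_restart | default(false))   # variable name is a guess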
35374 1726882916.28797: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 35374 1726882916.29082: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000001e 35374 1726882916.29096: variable 'ansible_search_path' from source: unknown 35374 1726882916.29099: variable 'ansible_search_path' from source: unknown 35374 1726882916.29246: calling self._execute() 35374 1726882916.29386: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.29402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.29416: variable 'omit' from source: magic vars 35374 1726882916.30239: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.30331: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.30572: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.30598: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.30645: when evaluation is False, skipping this task 35374 1726882916.30654: _execute() done 35374 1726882916.30661: dumping result to json 35374 1726882916.30674: done dumping result, returning 35374 1726882916.30688: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-00000000001e] 35374 1726882916.30705: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000001e skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.30911: no more pending results, returning what we have 35374 1726882916.30916: results queue empty 35374 1726882916.30917: checking for any_errors_fatal 35374 1726882916.30922: done checking for any_errors_fatal 35374 1726882916.30923: checking for max_fail_percentage 35374 1726882916.30925: done checking for max_fail_percentage 35374 1726882916.30926: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.30927: done checking to see if all hosts have failed 35374 1726882916.30928: getting the remaining hosts for this loop 35374 1726882916.30929: done getting the remaining hosts for this loop 35374 1726882916.30933: getting the next task for host managed_node1 35374 1726882916.30941: done getting next task for host managed_node1 35374 1726882916.30946: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 35374 1726882916.30949: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.30965: getting variables 35374 1726882916.30967: in VariableManager get_vars() 35374 1726882916.31019: Calling all_inventory to load vars for managed_node1 35374 1726882916.31022: Calling groups_inventory to load vars for managed_node1 35374 1726882916.31024: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.31037: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.31039: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.31043: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.31216: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000001e 35374 1726882916.31219: WORKER PROCESS EXITING 35374 1726882916.31240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.31458: done with get_vars() 35374 1726882916.31472: done getting variables 35374 1726882916.31590: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:56 -0400 (0:00:00.038) 0:00:02.895 ****** 35374 1726882916.31685: entering _queue_task() for managed_node1/package 35374 1726882916.32227: worker is 1 (out of 1 available) 35374 1726882916.32389: exiting _queue_task() for managed_node1/package 35374 1726882916.32401: done queuing things up, now waiting for results queue to drain 35374 1726882916.32402: waiting for pending results... 
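Finally, "Install packages" loads the generic package action, which dispatches to the platform's package manager (dnf on this host). A minimal sketch; the package list variable is the same assumption as in the update checks above:

- name: Install packages
  package:
    name: "{{ network_packages }}"   # hypothetical variable
    state: present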
35374 1726882916.33100: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 35374 1726882916.33396: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000001f 35374 1726882916.33501: variable 'ansible_search_path' from source: unknown 35374 1726882916.33512: variable 'ansible_search_path' from source: unknown 35374 1726882916.33544: calling self._execute() 35374 1726882916.33734: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.33743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.33755: variable 'omit' from source: magic vars 35374 1726882916.34565: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.34591: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.35401: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.35634: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.35642: when evaluation is False, skipping this task 35374 1726882916.35649: _execute() done 35374 1726882916.35657: dumping result to json 35374 1726882916.35667: done dumping result, returning 35374 1726882916.35682: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-ee6a-9b8c-00000000001f] 35374 1726882916.35694: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000001f skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.35859: no more pending results, returning what we have 35374 1726882916.35865: results queue empty 35374 1726882916.35866: checking for any_errors_fatal 35374 1726882916.35873: done checking for any_errors_fatal 35374 1726882916.35873: checking for max_fail_percentage 35374 1726882916.35875: done checking for max_fail_percentage 35374 1726882916.35876: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.35877: done checking to see if all hosts have failed 35374 1726882916.35878: getting the remaining hosts for this loop 35374 1726882916.35880: done getting the remaining hosts for this loop 35374 1726882916.35884: getting the next task for host managed_node1 35374 1726882916.35891: done getting next task for host managed_node1 35374 1726882916.35895: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 35374 1726882916.35897: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.35910: getting variables 35374 1726882916.35912: in VariableManager get_vars() 35374 1726882916.35965: Calling all_inventory to load vars for managed_node1 35374 1726882916.35968: Calling groups_inventory to load vars for managed_node1 35374 1726882916.35971: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.35983: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.35986: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.35988: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.36388: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000001f 35374 1726882916.36391: WORKER PROCESS EXITING 35374 1726882916.36491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.36692: done with get_vars() 35374 1726882916.36813: done getting variables 35374 1726882916.36871: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:56 -0400 (0:00:00.052) 0:00:02.947 ****** 35374 1726882916.36902: entering _queue_task() for managed_node1/package 35374 1726882916.37487: worker is 1 (out of 1 available) 35374 1726882916.37500: exiting _queue_task() for managed_node1/package 35374 1726882916.37512: done queuing things up, now waiting for results queue to drain 35374 1726882916.37513: waiting for pending results... 
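The task queued here ("Install NetworkManager and nmstate when using network_state variable", tasks/main.yml:85) goes through the generic package action plugin, as the ActionModule load above shows. A hedged sketch of what such a task typically looks like; the package list and the network_state guard are assumptions read off the task name, not taken from the role source:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}   # assumed guard implied by the task name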
35374 1726882916.38238: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 35374 1726882916.38546: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000020 35374 1726882916.38571: variable 'ansible_search_path' from source: unknown 35374 1726882916.38580: variable 'ansible_search_path' from source: unknown 35374 1726882916.38616: calling self._execute() 35374 1726882916.38718: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.38867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.38881: variable 'omit' from source: magic vars 35374 1726882916.39580: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.39629: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.39861: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.39948: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.39955: when evaluation is False, skipping this task 35374 1726882916.39962: _execute() done 35374 1726882916.39974: dumping result to json 35374 1726882916.39983: done dumping result, returning 35374 1726882916.39994: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-ee6a-9b8c-000000000020] 35374 1726882916.40006: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000020 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.40157: no more pending results, returning what we have 35374 1726882916.40161: results queue empty 35374 1726882916.40162: checking for any_errors_fatal 35374 1726882916.40169: done checking for any_errors_fatal 35374 1726882916.40170: checking for max_fail_percentage 35374 1726882916.40172: done checking for max_fail_percentage 35374 1726882916.40173: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.40174: done checking to see if all hosts have failed 35374 1726882916.40174: getting the remaining hosts for this loop 35374 1726882916.40176: done getting the remaining hosts for this loop 35374 1726882916.40180: getting the next task for host managed_node1 35374 1726882916.40187: done getting next task for host managed_node1 35374 1726882916.40191: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 35374 1726882916.40194: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.40208: getting variables 35374 1726882916.40210: in VariableManager get_vars() 35374 1726882916.40258: Calling all_inventory to load vars for managed_node1 35374 1726882916.40261: Calling groups_inventory to load vars for managed_node1 35374 1726882916.40263: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.40278: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.40280: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.40283: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.40449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.40648: done with get_vars() 35374 1726882916.40658: done getting variables 35374 1726882916.40691: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000020 35374 1726882916.40694: WORKER PROCESS EXITING 35374 1726882916.40726: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:41:56 -0400 (0:00:00.038) 0:00:02.986 ****** 35374 1726882916.40755: entering _queue_task() for managed_node1/package 35374 1726882916.41297: worker is 1 (out of 1 available) 35374 1726882916.41310: exiting _queue_task() for managed_node1/package 35374 1726882916.41321: done queuing things up, now waiting for results queue to drain 35374 1726882916.41323: waiting for pending results... 
35374 1726882916.41998: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 35374 1726882916.42149: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000021 35374 1726882916.42162: variable 'ansible_search_path' from source: unknown 35374 1726882916.42167: variable 'ansible_search_path' from source: unknown 35374 1726882916.42315: calling self._execute() 35374 1726882916.42395: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.42515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.42523: variable 'omit' from source: magic vars 35374 1726882916.43228: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.43240: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.43471: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.43599: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.43607: when evaluation is False, skipping this task 35374 1726882916.43614: _execute() done 35374 1726882916.43626: dumping result to json 35374 1726882916.43634: done dumping result, returning 35374 1726882916.43646: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-ee6a-9b8c-000000000021] 35374 1726882916.43657: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000021 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.43821: no more pending results, returning what we have 35374 1726882916.43826: results queue empty 35374 1726882916.43827: checking for any_errors_fatal 35374 1726882916.43833: done checking for any_errors_fatal 35374 1726882916.43834: checking for max_fail_percentage 35374 1726882916.43836: done checking for max_fail_percentage 35374 1726882916.43837: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.43838: done checking to see if all hosts have failed 35374 1726882916.43838: getting the remaining hosts for this loop 35374 1726882916.43840: done getting the remaining hosts for this loop 35374 1726882916.43844: getting the next task for host managed_node1 35374 1726882916.43851: done getting next task for host managed_node1 35374 1726882916.43856: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 35374 1726882916.43859: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.43889: getting variables 35374 1726882916.43892: in VariableManager get_vars() 35374 1726882916.43938: Calling all_inventory to load vars for managed_node1 35374 1726882916.43941: Calling groups_inventory to load vars for managed_node1 35374 1726882916.43943: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.43954: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.43956: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.43959: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.44167: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000021 35374 1726882916.44171: WORKER PROCESS EXITING 35374 1726882916.44195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.44417: done with get_vars() 35374 1726882916.44427: done getting variables 35374 1726882916.44589: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:41:56 -0400 (0:00:00.039) 0:00:03.026 ****** 35374 1726882916.44743: entering _queue_task() for managed_node1/service 35374 1726882916.44745: Creating lock for service 35374 1726882916.45395: worker is 1 (out of 1 available) 35374 1726882916.45409: exiting _queue_task() for managed_node1/service 35374 1726882916.45420: done queuing things up, now waiting for results queue to drain 35374 1726882916.45421: waiting for pending results... 
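This is the first service task in the run, which is why the worker prints "Creating lock for service" before queuing: the log shows a lock being created the first time each action type is used (the same happens later for network_connections, network_state, and ping). The task itself would be a conditional restart along these lines; a sketch, with the trigger variable purely hypothetical:

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_or_team_present | d(false)   # hypothetical flag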
35374 1726882916.46223: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 35374 1726882916.46467: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000022 35374 1726882916.46490: variable 'ansible_search_path' from source: unknown 35374 1726882916.46616: variable 'ansible_search_path' from source: unknown 35374 1726882916.46658: calling self._execute() 35374 1726882916.46849: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.46859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.46877: variable 'omit' from source: magic vars 35374 1726882916.47620: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.47716: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.47949: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.47959: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.47971: when evaluation is False, skipping this task 35374 1726882916.47978: _execute() done 35374 1726882916.47984: dumping result to json 35374 1726882916.47991: done dumping result, returning 35374 1726882916.48001: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-000000000022] 35374 1726882916.48036: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000022 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.48205: no more pending results, returning what we have 35374 1726882916.48209: results queue empty 35374 1726882916.48210: checking for any_errors_fatal 35374 1726882916.48218: done checking for any_errors_fatal 35374 1726882916.48219: checking for max_fail_percentage 35374 1726882916.48221: done checking for max_fail_percentage 35374 1726882916.48221: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.48222: done checking to see if all hosts have failed 35374 1726882916.48223: getting the remaining hosts for this loop 35374 1726882916.48225: done getting the remaining hosts for this loop 35374 1726882916.48229: getting the next task for host managed_node1 35374 1726882916.48236: done getting next task for host managed_node1 35374 1726882916.48240: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 35374 1726882916.48243: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.48255: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000022 35374 1726882916.48263: WORKER PROCESS EXITING 35374 1726882916.48280: getting variables 35374 1726882916.48282: in VariableManager get_vars() 35374 1726882916.48326: Calling all_inventory to load vars for managed_node1 35374 1726882916.48329: Calling groups_inventory to load vars for managed_node1 35374 1726882916.48331: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.48341: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.48343: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.48346: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.48518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.48741: done with get_vars() 35374 1726882916.48751: done getting variables 35374 1726882916.48810: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:41:56 -0400 (0:00:00.040) 0:00:03.067 ****** 35374 1726882916.48839: entering _queue_task() for managed_node1/service 35374 1726882916.49456: worker is 1 (out of 1 available) 35374 1726882916.49470: exiting _queue_task() for managed_node1/service 35374 1726882916.49492: done queuing things up, now waiting for results queue to drain 35374 1726882916.49494: waiting for pending results... 
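The result of this task comes back censored: even though the task is skipped, Ansible still honors no_log: true and replaces the result body with the "output has been hidden" notice seen below. A sketch of a task that would produce that behavior; no_log is confirmed by the log itself, while the service arguments are assumed from the task name:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager    # assumed unit name
        state: started
        enabled: true
      no_log: true              # confirmed: the skip result below is censored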
35374 1726882916.50350: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 35374 1726882916.50591: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000023 35374 1726882916.50660: variable 'ansible_search_path' from source: unknown 35374 1726882916.50673: variable 'ansible_search_path' from source: unknown 35374 1726882916.50712: calling self._execute() 35374 1726882916.50932: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.50942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.50954: variable 'omit' from source: magic vars 35374 1726882916.51744: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.51761: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.52003: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.52017: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.52025: when evaluation is False, skipping this task 35374 1726882916.52032: _execute() done 35374 1726882916.52037: dumping result to json 35374 1726882916.52050: done dumping result, returning 35374 1726882916.52070: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-ee6a-9b8c-000000000023] 35374 1726882916.52083: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000023 35374 1726882916.52261: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000023 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 35374 1726882916.52309: no more pending results, returning what we have 35374 1726882916.52314: results queue empty 35374 1726882916.52314: checking for any_errors_fatal 35374 1726882916.52320: done checking for any_errors_fatal 35374 1726882916.52321: checking for max_fail_percentage 35374 1726882916.52322: done checking for max_fail_percentage 35374 1726882916.52323: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.52324: done checking to see if all hosts have failed 35374 1726882916.52324: getting the remaining hosts for this loop 35374 1726882916.52326: done getting the remaining hosts for this loop 35374 1726882916.52329: getting the next task for host managed_node1 35374 1726882916.52336: done getting next task for host managed_node1 35374 1726882916.52339: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 35374 1726882916.52341: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.52355: getting variables 35374 1726882916.52356: in VariableManager get_vars() 35374 1726882916.52404: Calling all_inventory to load vars for managed_node1 35374 1726882916.52407: Calling groups_inventory to load vars for managed_node1 35374 1726882916.52410: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.52421: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.52424: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.52427: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.52643: WORKER PROCESS EXITING 35374 1726882916.52679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.52906: done with get_vars() 35374 1726882916.52915: done getting variables 35374 1726882916.53061: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:41:56 -0400 (0:00:00.042) 0:00:03.109 ****** 35374 1726882916.53093: entering _queue_task() for managed_node1/service 35374 1726882916.53354: worker is 1 (out of 1 available) 35374 1726882916.53516: exiting _queue_task() for managed_node1/service 35374 1726882916.53527: done queuing things up, now waiting for results queue to drain 35374 1726882916.53529: waiting for pending results... 
35374 1726882916.54281: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 35374 1726882916.54533: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000024 35374 1726882916.54583: variable 'ansible_search_path' from source: unknown 35374 1726882916.54681: variable 'ansible_search_path' from source: unknown 35374 1726882916.54720: calling self._execute() 35374 1726882916.54918: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.54929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.54942: variable 'omit' from source: magic vars 35374 1726882916.55697: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.55715: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.55997: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.56008: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.56015: when evaluation is False, skipping this task 35374 1726882916.56022: _execute() done 35374 1726882916.56029: dumping result to json 35374 1726882916.56037: done dumping result, returning 35374 1726882916.56101: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-ee6a-9b8c-000000000024] 35374 1726882916.56113: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000024 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.56262: no more pending results, returning what we have 35374 1726882916.56269: results queue empty 35374 1726882916.56271: checking for any_errors_fatal 35374 1726882916.56276: done checking for any_errors_fatal 35374 1726882916.56278: checking for max_fail_percentage 35374 1726882916.56280: done checking for max_fail_percentage 35374 1726882916.56280: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.56281: done checking to see if all hosts have failed 35374 1726882916.56282: getting the remaining hosts for this loop 35374 1726882916.56284: done getting the remaining hosts for this loop 35374 1726882916.56288: getting the next task for host managed_node1 35374 1726882916.56295: done getting next task for host managed_node1 35374 1726882916.56299: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 35374 1726882916.56301: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.56314: getting variables 35374 1726882916.56316: in VariableManager get_vars() 35374 1726882916.56368: Calling all_inventory to load vars for managed_node1 35374 1726882916.56371: Calling groups_inventory to load vars for managed_node1 35374 1726882916.56373: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.56387: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.56390: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.56393: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.56555: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000024 35374 1726882916.56559: WORKER PROCESS EXITING 35374 1726882916.56575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.56987: done with get_vars() 35374 1726882916.56996: done getting variables 35374 1726882916.57359: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:41:56 -0400 (0:00:00.042) 0:00:03.152 ****** 35374 1726882916.57507: entering _queue_task() for managed_node1/service 35374 1726882916.57728: worker is 1 (out of 1 available) 35374 1726882916.57741: exiting _queue_task() for managed_node1/service 35374 1726882916.57753: done queuing things up, now waiting for results queue to drain 35374 1726882916.57754: waiting for pending results... 
35374 1726882916.58552: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 35374 1726882916.58842: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000025 35374 1726882916.58870: variable 'ansible_search_path' from source: unknown 35374 1726882916.58879: variable 'ansible_search_path' from source: unknown 35374 1726882916.58916: calling self._execute() 35374 1726882916.59002: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.59012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.59025: variable 'omit' from source: magic vars 35374 1726882916.59395: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.59414: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.59541: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.59551: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.59558: when evaluation is False, skipping this task 35374 1726882916.59566: _execute() done 35374 1726882916.59576: dumping result to json 35374 1726882916.59585: done dumping result, returning 35374 1726882916.59599: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-ee6a-9b8c-000000000025] 35374 1726882916.59613: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000025 35374 1726882916.59718: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000025 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 35374 1726882916.59762: no more pending results, returning what we have 35374 1726882916.59769: results queue empty 35374 1726882916.59771: checking for any_errors_fatal 35374 1726882916.59778: done checking for any_errors_fatal 35374 1726882916.59779: checking for max_fail_percentage 35374 1726882916.59780: done checking for max_fail_percentage 35374 1726882916.59782: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.59782: done checking to see if all hosts have failed 35374 1726882916.59783: getting the remaining hosts for this loop 35374 1726882916.59785: done getting the remaining hosts for this loop 35374 1726882916.59788: getting the next task for host managed_node1 35374 1726882916.59794: done getting next task for host managed_node1 35374 1726882916.59798: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 35374 1726882916.59800: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.59816: getting variables 35374 1726882916.59818: in VariableManager get_vars() 35374 1726882916.59866: Calling all_inventory to load vars for managed_node1 35374 1726882916.59872: Calling groups_inventory to load vars for managed_node1 35374 1726882916.59874: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.59886: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.59889: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.59892: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.60132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.60361: done with get_vars() 35374 1726882916.60375: done getting variables 35374 1726882916.60509: WORKER PROCESS EXITING 35374 1726882916.60550: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:41:56 -0400 (0:00:00.031) 0:00:03.184 ****** 35374 1726882916.60590: entering _queue_task() for managed_node1/copy 35374 1726882916.60922: worker is 1 (out of 1 available) 35374 1726882916.60933: exiting _queue_task() for managed_node1/copy 35374 1726882916.60950: done queuing things up, now waiting for results queue to drain 35374 1726882916.60952: waiting for pending results... 
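This task resolves to the copy action (loaded from cache above). For the initscripts provider, the "network file dependency" conventionally means making sure /etc/sysconfig/network exists; a heavily hedged sketch, with both the path and the empty content being assumptions rather than the role's actual source:

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        content: ""                   # assumed placeholder content
        dest: /etc/sysconfig/network  # assumed initscripts dependency path
        force: false                  # create only if the file does not already exist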
35374 1726882916.61202: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 35374 1726882916.61329: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000026 35374 1726882916.61348: variable 'ansible_search_path' from source: unknown 35374 1726882916.61356: variable 'ansible_search_path' from source: unknown 35374 1726882916.61407: calling self._execute() 35374 1726882916.61485: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.61504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.61516: variable 'omit' from source: magic vars 35374 1726882916.61881: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.61900: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.62028: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.62050: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.62058: when evaluation is False, skipping this task 35374 1726882916.62069: _execute() done 35374 1726882916.62079: dumping result to json 35374 1726882916.62087: done dumping result, returning 35374 1726882916.62098: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-ee6a-9b8c-000000000026] 35374 1726882916.62109: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000026 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.62252: no more pending results, returning what we have 35374 1726882916.62256: results queue empty 35374 1726882916.62257: checking for any_errors_fatal 35374 1726882916.62266: done checking for any_errors_fatal 35374 1726882916.62269: checking for max_fail_percentage 35374 1726882916.62271: done checking for max_fail_percentage 35374 1726882916.62272: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.62273: done checking to see if all hosts have failed 35374 1726882916.62273: getting the remaining hosts for this loop 35374 1726882916.62276: done getting the remaining hosts for this loop 35374 1726882916.62280: getting the next task for host managed_node1 35374 1726882916.62287: done getting next task for host managed_node1 35374 1726882916.62291: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 35374 1726882916.62293: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.62305: getting variables 35374 1726882916.62307: in VariableManager get_vars() 35374 1726882916.62353: Calling all_inventory to load vars for managed_node1 35374 1726882916.62356: Calling groups_inventory to load vars for managed_node1 35374 1726882916.62358: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.62372: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.62376: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.62379: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.62557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.62798: done with get_vars() 35374 1726882916.62807: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:56 -0400 (0:00:00.023) 0:00:03.207 ****** 35374 1726882916.62893: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 35374 1726882916.62895: Creating lock for fedora.linux_system_roles.network_connections 35374 1726882916.63045: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000026 35374 1726882916.63049: WORKER PROCESS EXITING 35374 1726882916.63313: worker is 1 (out of 1 available) 35374 1726882916.63325: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 35374 1726882916.63335: done queuing things up, now waiting for results queue to drain 35374 1726882916.63337: waiting for pending results... 35374 1726882916.64029: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 35374 1726882916.64279: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000027 35374 1726882916.64346: variable 'ansible_search_path' from source: unknown 35374 1726882916.64354: variable 'ansible_search_path' from source: unknown 35374 1726882916.64397: calling self._execute() 35374 1726882916.64514: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.64664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.64683: variable 'omit' from source: magic vars 35374 1726882916.65149: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.65169: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.65313: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.65338: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.65351: when evaluation is False, skipping this task 35374 1726882916.65368: _execute() done 35374 1726882916.65383: dumping result to json 35374 1726882916.65391: done dumping result, returning 35374 1726882916.65402: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-ee6a-9b8c-000000000027] 35374 1726882916.65423: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000027 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.65590: no more pending results, returning what we have 35374 1726882916.65595: results 
queue empty 35374 1726882916.65596: checking for any_errors_fatal 35374 1726882916.65602: done checking for any_errors_fatal 35374 1726882916.65602: checking for max_fail_percentage 35374 1726882916.65604: done checking for max_fail_percentage 35374 1726882916.65605: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.65605: done checking to see if all hosts have failed 35374 1726882916.65606: getting the remaining hosts for this loop 35374 1726882916.65607: done getting the remaining hosts for this loop 35374 1726882916.65610: getting the next task for host managed_node1 35374 1726882916.65617: done getting next task for host managed_node1 35374 1726882916.65620: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 35374 1726882916.65622: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882916.65637: getting variables 35374 1726882916.65639: in VariableManager get_vars() 35374 1726882916.65687: Calling all_inventory to load vars for managed_node1 35374 1726882916.65690: Calling groups_inventory to load vars for managed_node1 35374 1726882916.65693: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.65703: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.65705: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.65707: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.65903: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000027 35374 1726882916.65907: WORKER PROCESS EXITING 35374 1726882916.65929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.66140: done with get_vars() 35374 1726882916.66150: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:41:56 -0400 (0:00:00.033) 0:00:03.241 ****** 35374 1726882916.66240: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 35374 1726882916.66242: Creating lock for fedora.linux_system_roles.network_state 35374 1726882916.66500: worker is 1 (out of 1 available) 35374 1726882916.66513: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 35374 1726882916.66531: done queuing things up, now waiting for results queue to drain 35374 1726882916.66533: waiting for pending results... 
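The two tasks around this point dispatch to the role's own action plugins, fedora.linux_system_roles.network_connections and fedora.linux_system_roles.network_state, each acquiring a freshly created lock on first use. Their names indicate they only do work when the matching role variable is set; a minimal sketch of driving the role through network_state, with the state body left as a placeholder assumption:

    - hosts: managed_node1
      vars:
        network_state:
          interfaces: []   # placeholder nmstate document; real content is role-specific
      roles:
        - fedora.linux_system_roles.network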
35374 1726882916.66816: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 35374 1726882916.66950: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000028 35374 1726882916.66981: variable 'ansible_search_path' from source: unknown 35374 1726882916.66989: variable 'ansible_search_path' from source: unknown 35374 1726882916.67029: calling self._execute() 35374 1726882916.67112: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.67122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.67134: variable 'omit' from source: magic vars 35374 1726882916.67850: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.67873: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.68021: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.68057: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.68071: when evaluation is False, skipping this task 35374 1726882916.68123: _execute() done 35374 1726882916.68132: dumping result to json 35374 1726882916.68139: done dumping result, returning 35374 1726882916.68148: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-ee6a-9b8c-000000000028] 35374 1726882916.68165: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000028 35374 1726882916.68317: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000028 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.68382: no more pending results, returning what we have 35374 1726882916.68387: results queue empty 35374 1726882916.68388: checking for any_errors_fatal 35374 1726882916.68395: done checking for any_errors_fatal 35374 1726882916.68396: checking for max_fail_percentage 35374 1726882916.68397: done checking for max_fail_percentage 35374 1726882916.68398: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.68399: done checking to see if all hosts have failed 35374 1726882916.68400: getting the remaining hosts for this loop 35374 1726882916.68401: done getting the remaining hosts for this loop 35374 1726882916.68404: getting the next task for host managed_node1 35374 1726882916.68409: done getting next task for host managed_node1 35374 1726882916.68413: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 35374 1726882916.68415: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.68427: getting variables 35374 1726882916.68428: in VariableManager get_vars() 35374 1726882916.68466: Calling all_inventory to load vars for managed_node1 35374 1726882916.68470: Calling groups_inventory to load vars for managed_node1 35374 1726882916.68472: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.68479: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.68482: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.68487: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.68590: WORKER PROCESS EXITING 35374 1726882916.68601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.68725: done with get_vars() 35374 1726882916.68732: done getting variables 35374 1726882916.68775: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:41:56 -0400 (0:00:00.025) 0:00:03.266 ****** 35374 1726882916.68798: entering _queue_task() for managed_node1/debug 35374 1726882916.68970: worker is 1 (out of 1 available) 35374 1726882916.68983: exiting _queue_task() for managed_node1/debug 35374 1726882916.68994: done queuing things up, now waiting for results queue to drain 35374 1726882916.68995: waiting for pending results... 
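The remaining skipped tasks are debug tasks. Note the shape of their skip results below: only "false_condition" is reported, with no "changed" key, unlike the package and service skips earlier. A sketch of such a task; the variable it would print is an assumption, since the register name is not visible in this log:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr   # assumed registered result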
35374 1726882916.69143: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 35374 1726882916.69216: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000029 35374 1726882916.69232: variable 'ansible_search_path' from source: unknown 35374 1726882916.69236: variable 'ansible_search_path' from source: unknown 35374 1726882916.69267: calling self._execute() 35374 1726882916.69321: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.69332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.69340: variable 'omit' from source: magic vars 35374 1726882916.69618: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.69628: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.69737: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.69749: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.69757: when evaluation is False, skipping this task 35374 1726882916.69767: _execute() done 35374 1726882916.69775: dumping result to json 35374 1726882916.69783: done dumping result, returning 35374 1726882916.69793: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-ee6a-9b8c-000000000029] 35374 1726882916.69804: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000029 35374 1726882916.69895: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000029 35374 1726882916.69904: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882916.69965: no more pending results, returning what we have 35374 1726882916.69972: results queue empty 35374 1726882916.69973: checking for any_errors_fatal 35374 1726882916.69978: done checking for any_errors_fatal 35374 1726882916.69979: checking for max_fail_percentage 35374 1726882916.69981: done checking for max_fail_percentage 35374 1726882916.69982: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.69983: done checking to see if all hosts have failed 35374 1726882916.69983: getting the remaining hosts for this loop 35374 1726882916.69985: done getting the remaining hosts for this loop 35374 1726882916.69988: getting the next task for host managed_node1 35374 1726882916.69994: done getting next task for host managed_node1 35374 1726882916.69997: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 35374 1726882916.70000: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.70021: getting variables 35374 1726882916.70023: in VariableManager get_vars() 35374 1726882916.70059: Calling all_inventory to load vars for managed_node1 35374 1726882916.70062: Calling groups_inventory to load vars for managed_node1 35374 1726882916.70071: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.70078: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.70081: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.70084: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.70416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.70722: done with get_vars() 35374 1726882916.70731: done getting variables 35374 1726882916.70787: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:41:56 -0400 (0:00:00.020) 0:00:03.286 ****** 35374 1726882916.70827: entering _queue_task() for managed_node1/debug 35374 1726882916.71188: worker is 1 (out of 1 available) 35374 1726882916.71201: exiting _queue_task() for managed_node1/debug 35374 1726882916.71212: done queuing things up, now waiting for results queue to drain 35374 1726882916.71214: waiting for pending results... 35374 1726882916.71517: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 35374 1726882916.71666: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000002a 35374 1726882916.71697: variable 'ansible_search_path' from source: unknown 35374 1726882916.71712: variable 'ansible_search_path' from source: unknown 35374 1726882916.71750: calling self._execute() 35374 1726882916.71839: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.71849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.71861: variable 'omit' from source: magic vars 35374 1726882916.72382: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.72392: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.72491: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.72495: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.72497: when evaluation is False, skipping this task 35374 1726882916.72502: _execute() done 35374 1726882916.72504: dumping result to json 35374 1726882916.72508: done dumping result, returning 35374 1726882916.72515: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-ee6a-9b8c-00000000002a] 35374 1726882916.72521: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000002a 35374 1726882916.72621: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000002a 35374 1726882916.72626: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == 
'7'" } 35374 1726882916.72669: no more pending results, returning what we have 35374 1726882916.72672: results queue empty 35374 1726882916.72673: checking for any_errors_fatal 35374 1726882916.72678: done checking for any_errors_fatal 35374 1726882916.72679: checking for max_fail_percentage 35374 1726882916.72680: done checking for max_fail_percentage 35374 1726882916.72681: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.72682: done checking to see if all hosts have failed 35374 1726882916.72683: getting the remaining hosts for this loop 35374 1726882916.72684: done getting the remaining hosts for this loop 35374 1726882916.72687: getting the next task for host managed_node1 35374 1726882916.72691: done getting next task for host managed_node1 35374 1726882916.72695: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 35374 1726882916.72697: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882916.72707: getting variables 35374 1726882916.72708: in VariableManager get_vars() 35374 1726882916.72740: Calling all_inventory to load vars for managed_node1 35374 1726882916.72743: Calling groups_inventory to load vars for managed_node1 35374 1726882916.72744: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.72750: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.72751: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.72753: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.72855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.72979: done with get_vars() 35374 1726882916.72986: done getting variables 35374 1726882916.73023: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:41:56 -0400 (0:00:00.022) 0:00:03.309 ****** 35374 1726882916.73045: entering _queue_task() for managed_node1/debug 35374 1726882916.73229: worker is 1 (out of 1 available) 35374 1726882916.73247: exiting _queue_task() for managed_node1/debug 35374 1726882916.73259: done queuing things up, now waiting for results queue to drain 35374 1726882916.73260: waiting for pending results... 
35374 1726882916.73514: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 35374 1726882916.73638: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000002b 35374 1726882916.73658: variable 'ansible_search_path' from source: unknown 35374 1726882916.73670: variable 'ansible_search_path' from source: unknown 35374 1726882916.73730: calling self._execute() 35374 1726882916.73818: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.73831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.73845: variable 'omit' from source: magic vars 35374 1726882916.74253: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.74277: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.74461: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.74474: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.74505: when evaluation is False, skipping this task 35374 1726882916.74515: _execute() done 35374 1726882916.74522: dumping result to json 35374 1726882916.74535: done dumping result, returning 35374 1726882916.74550: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-ee6a-9b8c-00000000002b] 35374 1726882916.74559: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000002b 35374 1726882916.74703: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000002b 35374 1726882916.74709: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882916.75412: no more pending results, returning what we have 35374 1726882916.75416: results queue empty 35374 1726882916.75416: checking for any_errors_fatal 35374 1726882916.75421: done checking for any_errors_fatal 35374 1726882916.75421: checking for max_fail_percentage 35374 1726882916.75423: done checking for max_fail_percentage 35374 1726882916.75424: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.75425: done checking to see if all hosts have failed 35374 1726882916.75425: getting the remaining hosts for this loop 35374 1726882916.75426: done getting the remaining hosts for this loop 35374 1726882916.75429: getting the next task for host managed_node1 35374 1726882916.75434: done getting next task for host managed_node1 35374 1726882916.75438: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 35374 1726882916.75441: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.75460: getting variables 35374 1726882916.75462: in VariableManager get_vars() 35374 1726882916.75514: Calling all_inventory to load vars for managed_node1 35374 1726882916.75516: Calling groups_inventory to load vars for managed_node1 35374 1726882916.75519: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.75527: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.75530: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.75532: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.75766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.75907: done with get_vars() 35374 1726882916.75914: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:41:56 -0400 (0:00:00.029) 0:00:03.338 ****** 35374 1726882916.75976: entering _queue_task() for managed_node1/ping 35374 1726882916.75977: Creating lock for ping 35374 1726882916.76387: worker is 1 (out of 1 available) 35374 1726882916.76396: exiting _queue_task() for managed_node1/ping 35374 1726882916.76405: done queuing things up, now waiting for results queue to drain 35374 1726882916.76407: waiting for pending results... 35374 1726882916.76454: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 35374 1726882916.76575: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000002c 35374 1726882916.76596: variable 'ansible_search_path' from source: unknown 35374 1726882916.76605: variable 'ansible_search_path' from source: unknown 35374 1726882916.76644: calling self._execute() 35374 1726882916.76723: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.76734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.76747: variable 'omit' from source: magic vars 35374 1726882916.77104: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.77129: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.77368: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.77391: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.77395: when evaluation is False, skipping this task 35374 1726882916.77439: _execute() done 35374 1726882916.77448: dumping result to json 35374 1726882916.77460: done dumping result, returning 35374 1726882916.77464: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-ee6a-9b8c-00000000002c] 35374 1726882916.77472: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000002c skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.77635: no more pending results, returning what we have 35374 1726882916.77639: results queue empty 35374 1726882916.77640: checking for any_errors_fatal 35374 1726882916.77647: done checking for any_errors_fatal 35374 1726882916.77647: checking for max_fail_percentage 35374 1726882916.77649: done checking for max_fail_percentage 35374 1726882916.77650: checking to see if all hosts have failed and the 
running result is not ok 35374 1726882916.77651: done checking to see if all hosts have failed 35374 1726882916.77651: getting the remaining hosts for this loop 35374 1726882916.77653: done getting the remaining hosts for this loop 35374 1726882916.77656: getting the next task for host managed_node1 35374 1726882916.77735: done getting next task for host managed_node1 35374 1726882916.77738: ^ task is: TASK: meta (role_complete) 35374 1726882916.77741: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882916.77753: getting variables 35374 1726882916.77754: in VariableManager get_vars() 35374 1726882916.77799: Calling all_inventory to load vars for managed_node1 35374 1726882916.77801: Calling groups_inventory to load vars for managed_node1 35374 1726882916.77804: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.77811: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.77814: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.77818: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.77981: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000002c 35374 1726882916.77984: WORKER PROCESS EXITING 35374 1726882916.78006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.78242: done with get_vars() 35374 1726882916.78251: done getting variables 35374 1726882916.78341: done queuing things up, now waiting for results queue to drain 35374 1726882916.78342: results queue empty 35374 1726882916.78343: checking for any_errors_fatal 35374 1726882916.78345: done checking for any_errors_fatal 35374 1726882916.78346: checking for max_fail_percentage 35374 1726882916.78347: done checking for max_fail_percentage 35374 1726882916.78347: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.78348: done checking to see if all hosts have failed 35374 1726882916.78349: getting the remaining hosts for this loop 35374 1726882916.78350: done getting the remaining hosts for this loop 35374 1726882916.78352: getting the next task for host managed_node1 35374 1726882916.78356: done getting next task for host managed_node1 35374 1726882916.78359: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 35374 1726882916.78360: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.78372: getting variables 35374 1726882916.78374: in VariableManager get_vars() 35374 1726882916.78391: Calling all_inventory to load vars for managed_node1 35374 1726882916.78393: Calling groups_inventory to load vars for managed_node1 35374 1726882916.78395: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.78406: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.78408: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.78419: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.78577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.78730: done with get_vars() 35374 1726882916.78736: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:56 -0400 (0:00:00.028) 0:00:03.366 ****** 35374 1726882916.78797: entering _queue_task() for managed_node1/include_tasks 35374 1726882916.78998: worker is 1 (out of 1 available) 35374 1726882916.79011: exiting _queue_task() for managed_node1/include_tasks 35374 1726882916.79022: done queuing things up, now waiting for results queue to drain 35374 1726882916.79023: waiting for pending results... 35374 1726882916.79498: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 35374 1726882916.79626: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000063 35374 1726882916.79646: variable 'ansible_search_path' from source: unknown 35374 1726882916.79654: variable 'ansible_search_path' from source: unknown 35374 1726882916.79710: calling self._execute() 35374 1726882916.79796: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.79806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.79818: variable 'omit' from source: magic vars 35374 1726882916.80196: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.80214: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.80336: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.80348: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.80356: when evaluation is False, skipping this task 35374 1726882916.80366: _execute() done 35374 1726882916.80378: dumping result to json 35374 1726882916.80391: done dumping result, returning 35374 1726882916.80401: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-ee6a-9b8c-000000000063] 35374 1726882916.80411: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000063 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.80553: no more pending results, returning what we have 35374 1726882916.80561: results queue empty 35374 1726882916.80562: checking for any_errors_fatal 35374 1726882916.80566: done checking for any_errors_fatal 35374 1726882916.80566: checking for max_fail_percentage 35374 1726882916.80570: done checking for max_fail_percentage 35374 1726882916.80571: checking to see if all hosts have failed and the 
running result is not ok 35374 1726882916.80572: done checking to see if all hosts have failed 35374 1726882916.80572: getting the remaining hosts for this loop 35374 1726882916.80574: done getting the remaining hosts for this loop 35374 1726882916.80578: getting the next task for host managed_node1 35374 1726882916.80585: done getting next task for host managed_node1 35374 1726882916.80589: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 35374 1726882916.80592: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882916.80607: getting variables 35374 1726882916.80609: in VariableManager get_vars() 35374 1726882916.80661: Calling all_inventory to load vars for managed_node1 35374 1726882916.80666: Calling groups_inventory to load vars for managed_node1 35374 1726882916.80671: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.80683: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.80686: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.80689: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.80947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.81235: done with get_vars() 35374 1726882916.81242: done getting variables 35374 1726882916.81269: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000063 35374 1726882916.81272: WORKER PROCESS EXITING 35374 1726882916.81295: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:41:56 -0400 (0:00:00.025) 0:00:03.391 ****** 35374 1726882916.81319: entering _queue_task() for managed_node1/debug 35374 1726882916.81491: worker is 1 (out of 1 available) 35374 1726882916.81504: exiting _queue_task() for managed_node1/debug 35374 1726882916.81515: done queuing things up, now waiting for results queue to drain 35374 1726882916.81516: waiting for pending results... 
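Each task banner in this log carries the same profiling fields: the wall-clock timestamp, the previous task's duration in parentheses, and the running total for the play. The "Print network provider" task queued here, on a host where the conditional passed, would simply emit the resolved provider; a hedged sketch (hypothetical wording, assuming a role variable named network_provider):

    # Hypothetical sketch of a provider-printing task; network_provider
    # is assumed to be a role variable resolved from facts or defaults.
    - name: Print network provider
      debug:
        msg: "Using network provider: {{ network_provider }}"
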
35374 1726882916.81662: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 35374 1726882916.81744: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000064 35374 1726882916.81755: variable 'ansible_search_path' from source: unknown 35374 1726882916.81761: variable 'ansible_search_path' from source: unknown 35374 1726882916.81790: calling self._execute() 35374 1726882916.81849: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.81853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.81862: variable 'omit' from source: magic vars 35374 1726882916.82129: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.82139: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.82218: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.82222: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.82225: when evaluation is False, skipping this task 35374 1726882916.82228: _execute() done 35374 1726882916.82231: dumping result to json 35374 1726882916.82233: done dumping result, returning 35374 1726882916.82244: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-ee6a-9b8c-000000000064] 35374 1726882916.82247: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000064 35374 1726882916.82328: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000064 35374 1726882916.82331: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882916.82382: no more pending results, returning what we have 35374 1726882916.82385: results queue empty 35374 1726882916.82432: checking for any_errors_fatal 35374 1726882916.82438: done checking for any_errors_fatal 35374 1726882916.82439: checking for max_fail_percentage 35374 1726882916.82440: done checking for max_fail_percentage 35374 1726882916.82441: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.82442: done checking to see if all hosts have failed 35374 1726882916.82443: getting the remaining hosts for this loop 35374 1726882916.82444: done getting the remaining hosts for this loop 35374 1726882916.82447: getting the next task for host managed_node1 35374 1726882916.82452: done getting next task for host managed_node1 35374 1726882916.82455: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 35374 1726882916.82458: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.82474: getting variables 35374 1726882916.82476: in VariableManager get_vars() 35374 1726882916.82513: Calling all_inventory to load vars for managed_node1 35374 1726882916.82516: Calling groups_inventory to load vars for managed_node1 35374 1726882916.82518: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.82526: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.82529: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.82532: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.82692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.82937: done with get_vars() 35374 1726882916.82945: done getting variables 35374 1726882916.83000: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:56 -0400 (0:00:00.017) 0:00:03.408 ****** 35374 1726882916.83029: entering _queue_task() for managed_node1/fail 35374 1726882916.83225: worker is 1 (out of 1 available) 35374 1726882916.83237: exiting _queue_task() for managed_node1/fail 35374 1726882916.83248: done queuing things up, now waiting for results queue to drain 35374 1726882916.83249: waiting for pending results... 
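This is the first of several guard tasks driven by the fail action plugin loaded above; each would abort the play when its condition holds, and each is skipped in this run by the same distribution gate. A hedged sketch of such a guard (hypothetical message and conditions; network_state and network_provider are assumed role variables):

    # Hypothetical guard: fail stops the play when network_state is used
    # together with the initscripts provider. Skipped in this run because
    # the distribution gate is already False.
    - name: Abort when network_state is used with the initscripts provider
      fail:
        msg: Applying the network state configuration is not supported with the initscripts provider
      when:
        - network_state != {}
        - network_provider == "initscripts"
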
35374 1726882916.83492: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 35374 1726882916.83605: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000065 35374 1726882916.83624: variable 'ansible_search_path' from source: unknown 35374 1726882916.83634: variable 'ansible_search_path' from source: unknown 35374 1726882916.83667: calling self._execute() 35374 1726882916.83741: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.83754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.83762: variable 'omit' from source: magic vars 35374 1726882916.84018: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.84029: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.84112: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.84115: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.84118: when evaluation is False, skipping this task 35374 1726882916.84125: _execute() done 35374 1726882916.84127: dumping result to json 35374 1726882916.84130: done dumping result, returning 35374 1726882916.84138: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-ee6a-9b8c-000000000065] 35374 1726882916.84143: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000065 35374 1726882916.84225: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000065 35374 1726882916.84230: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.84272: no more pending results, returning what we have 35374 1726882916.84275: results queue empty 35374 1726882916.84276: checking for any_errors_fatal 35374 1726882916.84282: done checking for any_errors_fatal 35374 1726882916.84283: checking for max_fail_percentage 35374 1726882916.84284: done checking for max_fail_percentage 35374 1726882916.84285: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.84286: done checking to see if all hosts have failed 35374 1726882916.84287: getting the remaining hosts for this loop 35374 1726882916.84288: done getting the remaining hosts for this loop 35374 1726882916.84291: getting the next task for host managed_node1 35374 1726882916.84295: done getting next task for host managed_node1 35374 1726882916.84299: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 35374 1726882916.84301: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 35374 1726882916.84313: getting variables 35374 1726882916.84314: in VariableManager get_vars() 35374 1726882916.84342: Calling all_inventory to load vars for managed_node1 35374 1726882916.84348: Calling groups_inventory to load vars for managed_node1 35374 1726882916.84350: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.84359: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.84362: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.84368: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.84516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.84710: done with get_vars() 35374 1726882916.84720: done getting variables 35374 1726882916.84776: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:41:56 -0400 (0:00:00.017) 0:00:03.426 ****** 35374 1726882916.84808: entering _queue_task() for managed_node1/fail 35374 1726882916.84999: worker is 1 (out of 1 available) 35374 1726882916.85012: exiting _queue_task() for managed_node1/fail 35374 1726882916.85023: done queuing things up, now waiting for results queue to drain 35374 1726882916.85024: waiting for pending results... 
35374 1726882916.85268: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 35374 1726882916.85395: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000066 35374 1726882916.85415: variable 'ansible_search_path' from source: unknown 35374 1726882916.85424: variable 'ansible_search_path' from source: unknown 35374 1726882916.85459: calling self._execute() 35374 1726882916.85532: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.85535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.85543: variable 'omit' from source: magic vars 35374 1726882916.85800: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.85811: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.85893: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.85896: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.85899: when evaluation is False, skipping this task 35374 1726882916.85902: _execute() done 35374 1726882916.85905: dumping result to json 35374 1726882916.85907: done dumping result, returning 35374 1726882916.85913: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-ee6a-9b8c-000000000066] 35374 1726882916.85921: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000066 35374 1726882916.86007: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000066 35374 1726882916.86010: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.86056: no more pending results, returning what we have 35374 1726882916.86059: results queue empty 35374 1726882916.86060: checking for any_errors_fatal 35374 1726882916.86065: done checking for any_errors_fatal 35374 1726882916.86066: checking for max_fail_percentage 35374 1726882916.86070: done checking for max_fail_percentage 35374 1726882916.86070: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.86071: done checking to see if all hosts have failed 35374 1726882916.86072: getting the remaining hosts for this loop 35374 1726882916.86073: done getting the remaining hosts for this loop 35374 1726882916.86076: getting the next task for host managed_node1 35374 1726882916.86080: done getting next task for host managed_node1 35374 1726882916.86083: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 35374 1726882916.86086: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.86098: getting variables 35374 1726882916.86100: in VariableManager get_vars() 35374 1726882916.86128: Calling all_inventory to load vars for managed_node1 35374 1726882916.86129: Calling groups_inventory to load vars for managed_node1 35374 1726882916.86131: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.86136: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.86138: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.86139: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.86273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.86390: done with get_vars() 35374 1726882916.86396: done getting variables 35374 1726882916.86435: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:41:56 -0400 (0:00:00.016) 0:00:03.443 ****** 35374 1726882916.86455: entering _queue_task() for managed_node1/fail 35374 1726882916.86606: worker is 1 (out of 1 available) 35374 1726882916.86618: exiting _queue_task() for managed_node1/fail 35374 1726882916.86630: done queuing things up, now waiting for results queue to drain 35374 1726882916.86631: waiting for pending results... 
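Every conditional in this log compares ansible_distribution_major_version as a string ('6', '7'); a guard like the EL10-or-later teaming abort queued here needs an ordered comparison, which means casting the fact first. A hedged sketch (hypothetical condition; presumably the real task also checks whether teaming is actually requested, which is omitted here):

    # Hypothetical version guard: the int filter makes the major-version
    # fact comparable numerically instead of as a string.
    - name: Abort applying teaming configuration on EL10 or later
      fail:
        msg: Teaming is not supported on EL10 or later
      when: ansible_distribution_major_version | int >= 10
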
35374 1726882916.86780: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 35374 1726882916.86847: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000067 35374 1726882916.86860: variable 'ansible_search_path' from source: unknown 35374 1726882916.86872: variable 'ansible_search_path' from source: unknown 35374 1726882916.86900: calling self._execute() 35374 1726882916.86950: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.86954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.86962: variable 'omit' from source: magic vars 35374 1726882916.87210: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.87220: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.87313: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.87323: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.87330: when evaluation is False, skipping this task 35374 1726882916.87336: _execute() done 35374 1726882916.87342: dumping result to json 35374 1726882916.87348: done dumping result, returning 35374 1726882916.87356: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-ee6a-9b8c-000000000067] 35374 1726882916.87367: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000067 35374 1726882916.87458: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000067 35374 1726882916.87466: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.87849: no more pending results, returning what we have 35374 1726882916.87852: results queue empty 35374 1726882916.87853: checking for any_errors_fatal 35374 1726882916.87857: done checking for any_errors_fatal 35374 1726882916.87858: checking for max_fail_percentage 35374 1726882916.87859: done checking for max_fail_percentage 35374 1726882916.87860: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.87861: done checking to see if all hosts have failed 35374 1726882916.87862: getting the remaining hosts for this loop 35374 1726882916.87865: done getting the remaining hosts for this loop 35374 1726882916.87868: getting the next task for host managed_node1 35374 1726882916.87873: done getting next task for host managed_node1 35374 1726882916.87878: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 35374 1726882916.87880: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.87893: getting variables 35374 1726882916.87894: in VariableManager get_vars() 35374 1726882916.87930: Calling all_inventory to load vars for managed_node1 35374 1726882916.87933: Calling groups_inventory to load vars for managed_node1 35374 1726882916.87935: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.87942: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.87944: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.87947: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.88098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.88286: done with get_vars() 35374 1726882916.88296: done getting variables 35374 1726882916.88350: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:56 -0400 (0:00:00.019) 0:00:03.462 ****** 35374 1726882916.88382: entering _queue_task() for managed_node1/dnf 35374 1726882916.88586: worker is 1 (out of 1 available) 35374 1726882916.88599: exiting _queue_task() for managed_node1/dnf 35374 1726882916.88611: done queuing things up, now waiting for results queue to drain 35374 1726882916.88612: waiting for pending results... 
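The dnf action plugin loaded above can answer "are updates available?" without changing anything when the task runs in check mode: with state: latest, the task's changed status then indicates pending updates. A hedged sketch of such a probe (hypothetical package list):

    # Hypothetical sketch: with check_mode, state: latest only reports
    # whether an update would be installed; nothing is changed.
    - name: Check whether updates for network packages are available
      dnf:
        name:
          - NetworkManager
        state: latest
      check_mode: true
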
35374 1726882916.88862: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 35374 1726882916.88990: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000068 35374 1726882916.89010: variable 'ansible_search_path' from source: unknown 35374 1726882916.89018: variable 'ansible_search_path' from source: unknown 35374 1726882916.89060: calling self._execute() 35374 1726882916.89135: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.89146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.89164: variable 'omit' from source: magic vars 35374 1726882916.89509: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.89526: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.89643: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.89654: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.89661: when evaluation is False, skipping this task 35374 1726882916.89670: _execute() done 35374 1726882916.89679: dumping result to json 35374 1726882916.89687: done dumping result, returning 35374 1726882916.89698: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-000000000068] 35374 1726882916.89712: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000068 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.89857: no more pending results, returning what we have 35374 1726882916.89861: results queue empty 35374 1726882916.89863: checking for any_errors_fatal 35374 1726882916.89870: done checking for any_errors_fatal 35374 1726882916.89871: checking for max_fail_percentage 35374 1726882916.89873: done checking for max_fail_percentage 35374 1726882916.89874: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.89875: done checking to see if all hosts have failed 35374 1726882916.89875: getting the remaining hosts for this loop 35374 1726882916.89878: done getting the remaining hosts for this loop 35374 1726882916.89881: getting the next task for host managed_node1 35374 1726882916.89889: done getting next task for host managed_node1 35374 1726882916.89893: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 35374 1726882916.89895: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.89911: getting variables 35374 1726882916.89913: in VariableManager get_vars() 35374 1726882916.89954: Calling all_inventory to load vars for managed_node1 35374 1726882916.89957: Calling groups_inventory to load vars for managed_node1 35374 1726882916.89959: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.89971: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.89974: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.89977: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.90158: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000068 35374 1726882916.90162: WORKER PROCESS EXITING 35374 1726882916.90184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.90379: done with get_vars() 35374 1726882916.90388: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 35374 1726882916.90452: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:41:56 -0400 (0:00:00.020) 0:00:03.483 ****** 35374 1726882916.90482: entering _queue_task() for managed_node1/yum 35374 1726882916.90680: worker is 1 (out of 1 available) 35374 1726882916.90694: exiting _queue_task() for managed_node1/yum 35374 1726882916.90706: done queuing things up, now waiting for results queue to drain 35374 1726882916.90708: waiting for pending results... 
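The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry above is ansible-core's module redirect at work: on a dnf-based host, a task written against the yum module is served by the dnf action plugin, so this check and the DNF check before it run through the same code path. A hedged sketch of the yum-flavored twin (hypothetical package list):

    # Hypothetical sketch: written with yum, executed by dnf via the
    # redirect logged above.
    - name: Check whether updates for network packages are available (yum)
      yum:
        name: NetworkManager
        state: latest
      check_mode: true
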
35374 1726882916.90957: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 35374 1726882916.91046: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000069 35374 1726882916.91063: variable 'ansible_search_path' from source: unknown 35374 1726882916.91069: variable 'ansible_search_path' from source: unknown 35374 1726882916.91096: calling self._execute() 35374 1726882916.91153: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.91158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.91167: variable 'omit' from source: magic vars 35374 1726882916.91420: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.91430: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.91510: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.91514: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.91516: when evaluation is False, skipping this task 35374 1726882916.91520: _execute() done 35374 1726882916.91523: dumping result to json 35374 1726882916.91526: done dumping result, returning 35374 1726882916.91533: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-000000000069] 35374 1726882916.91538: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000069 35374 1726882916.91621: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000069 35374 1726882916.91624: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.91673: no more pending results, returning what we have 35374 1726882916.91676: results queue empty 35374 1726882916.91677: checking for any_errors_fatal 35374 1726882916.91681: done checking for any_errors_fatal 35374 1726882916.91682: checking for max_fail_percentage 35374 1726882916.91683: done checking for max_fail_percentage 35374 1726882916.91684: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.91684: done checking to see if all hosts have failed 35374 1726882916.91685: getting the remaining hosts for this loop 35374 1726882916.91686: done getting the remaining hosts for this loop 35374 1726882916.91689: getting the next task for host managed_node1 35374 1726882916.91692: done getting next task for host managed_node1 35374 1726882916.91695: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 35374 1726882916.91697: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.91708: getting variables 35374 1726882916.91709: in VariableManager get_vars() 35374 1726882916.91744: Calling all_inventory to load vars for managed_node1 35374 1726882916.91746: Calling groups_inventory to load vars for managed_node1 35374 1726882916.91748: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.91754: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.91755: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.91757: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.91855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.91975: done with get_vars() 35374 1726882916.91982: done getting variables 35374 1726882916.92018: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:41:56 -0400 (0:00:00.015) 0:00:03.499 ****** 35374 1726882916.92038: entering _queue_task() for managed_node1/fail 35374 1726882916.92189: worker is 1 (out of 1 available) 35374 1726882916.92202: exiting _queue_task() for managed_node1/fail 35374 1726882916.92213: done queuing things up, now waiting for results queue to drain 35374 1726882916.92214: waiting for pending results... 
35374 1726882916.92358: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 35374 1726882916.92434: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000006a 35374 1726882916.92444: variable 'ansible_search_path' from source: unknown 35374 1726882916.92447: variable 'ansible_search_path' from source: unknown 35374 1726882916.92476: calling self._execute() 35374 1726882916.92525: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.92529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.92538: variable 'omit' from source: magic vars 35374 1726882916.92820: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.92836: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.92950: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.92961: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.92975: when evaluation is False, skipping this task 35374 1726882916.92984: _execute() done 35374 1726882916.92992: dumping result to json 35374 1726882916.92999: done dumping result, returning 35374 1726882916.93009: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-00000000006a] 35374 1726882916.93021: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000006a skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.93159: no more pending results, returning what we have 35374 1726882916.93166: results queue empty 35374 1726882916.93168: checking for any_errors_fatal 35374 1726882916.93173: done checking for any_errors_fatal 35374 1726882916.93174: checking for max_fail_percentage 35374 1726882916.93176: done checking for max_fail_percentage 35374 1726882916.93176: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.93177: done checking to see if all hosts have failed 35374 1726882916.93178: getting the remaining hosts for this loop 35374 1726882916.93179: done getting the remaining hosts for this loop 35374 1726882916.93183: getting the next task for host managed_node1 35374 1726882916.93189: done getting next task for host managed_node1 35374 1726882916.93192: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 35374 1726882916.93195: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.93214: getting variables 35374 1726882916.93216: in VariableManager get_vars() 35374 1726882916.93256: Calling all_inventory to load vars for managed_node1 35374 1726882916.93258: Calling groups_inventory to load vars for managed_node1 35374 1726882916.93260: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.93270: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.93272: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.93276: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.93285: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000006a 35374 1726882916.93288: WORKER PROCESS EXITING 35374 1726882916.93488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.93695: done with get_vars() 35374 1726882916.93704: done getting variables 35374 1726882916.93757: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:56 -0400 (0:00:00.017) 0:00:03.516 ****** 35374 1726882916.93789: entering _queue_task() for managed_node1/package 35374 1726882916.93970: worker is 1 (out of 1 available) 35374 1726882916.93984: exiting _queue_task() for managed_node1/package 35374 1726882916.93995: done queuing things up, now waiting for results queue to drain 35374 1726882916.93997: waiting for pending results... 
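The generic package action loaded above dispatches to whichever package manager the host facts report, dnf in this environment. A hedged sketch of an install task of this shape (network_packages is an assumed variable name, with a placeholder default):

    # Hypothetical sketch: the package module resolves to dnf on this
    # host; the variable name and its default are placeholders.
    - name: Install packages
      package:
        name: "{{ network_packages | default(['NetworkManager']) }}"
        state: present
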
35374 1726882916.94162: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 35374 1726882916.94234: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000006b 35374 1726882916.94244: variable 'ansible_search_path' from source: unknown 35374 1726882916.94247: variable 'ansible_search_path' from source: unknown 35374 1726882916.94283: calling self._execute() 35374 1726882916.94328: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.94332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.94340: variable 'omit' from source: magic vars 35374 1726882916.94620: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.94638: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.94875: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.94887: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.94895: when evaluation is False, skipping this task 35374 1726882916.94902: _execute() done 35374 1726882916.94909: dumping result to json 35374 1726882916.94916: done dumping result, returning 35374 1726882916.94926: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-ee6a-9b8c-00000000006b] 35374 1726882916.94936: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000006b skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.95081: no more pending results, returning what we have 35374 1726882916.95087: results queue empty 35374 1726882916.95088: checking for any_errors_fatal 35374 1726882916.95092: done checking for any_errors_fatal 35374 1726882916.95093: checking for max_fail_percentage 35374 1726882916.95095: done checking for max_fail_percentage 35374 1726882916.95096: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.95096: done checking to see if all hosts have failed 35374 1726882916.95097: getting the remaining hosts for this loop 35374 1726882916.95099: done getting the remaining hosts for this loop 35374 1726882916.95102: getting the next task for host managed_node1 35374 1726882916.95110: done getting next task for host managed_node1 35374 1726882916.95113: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 35374 1726882916.95116: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.95130: getting variables 35374 1726882916.95132: in VariableManager get_vars() 35374 1726882916.95181: Calling all_inventory to load vars for managed_node1 35374 1726882916.95184: Calling groups_inventory to load vars for managed_node1 35374 1726882916.95187: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.95198: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.95201: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.95204: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.95381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.95591: done with get_vars() 35374 1726882916.95601: done getting variables 35374 1726882916.95666: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000006b 35374 1726882916.95684: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 35374 1726882916.95709: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:56 -0400 (0:00:00.019) 0:00:03.535 ****** 35374 1726882916.95725: entering _queue_task() for managed_node1/package 35374 1726882916.96444: worker is 1 (out of 1 available) 35374 1726882916.96455: exiting _queue_task() for managed_node1/package 35374 1726882916.96468: done queuing things up, now waiting for results queue to drain 35374 1726882916.96469: waiting for pending results... 
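The next task, at tasks/main.yml:85, is also a 'package' action. Its title says it installs NetworkManager and nmstate only when the network_state variable is in use, so a plausible sketch looks like the following; the package names come from the task title and the network_state guard is presumed rather than shown (evaluation never reaches it here, since the version test fails first):

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version == '7'
        - network_state | length > 0  # presumed guard, never evaluated in this run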
35374 1726882916.96797: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 35374 1726882916.96937: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000006c 35374 1726882916.96957: variable 'ansible_search_path' from source: unknown 35374 1726882916.96967: variable 'ansible_search_path' from source: unknown 35374 1726882916.97003: calling self._execute() 35374 1726882916.97092: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.97103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.97115: variable 'omit' from source: magic vars 35374 1726882916.97500: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.97517: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.97640: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.97651: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.97658: when evaluation is False, skipping this task 35374 1726882916.97667: _execute() done 35374 1726882916.97675: dumping result to json 35374 1726882916.97691: done dumping result, returning 35374 1726882916.97706: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-ee6a-9b8c-00000000006c] 35374 1726882916.97717: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000006c skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.97866: no more pending results, returning what we have 35374 1726882916.97880: results queue empty 35374 1726882916.97882: checking for any_errors_fatal 35374 1726882916.97887: done checking for any_errors_fatal 35374 1726882916.97888: checking for max_fail_percentage 35374 1726882916.97890: done checking for max_fail_percentage 35374 1726882916.97891: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.97892: done checking to see if all hosts have failed 35374 1726882916.97892: getting the remaining hosts for this loop 35374 1726882916.97894: done getting the remaining hosts for this loop 35374 1726882916.97898: getting the next task for host managed_node1 35374 1726882916.97905: done getting next task for host managed_node1 35374 1726882916.97909: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 35374 1726882916.97912: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882916.97927: getting variables 35374 1726882916.97929: in VariableManager get_vars() 35374 1726882916.97979: Calling all_inventory to load vars for managed_node1 35374 1726882916.97983: Calling groups_inventory to load vars for managed_node1 35374 1726882916.97985: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882916.97997: Calling all_plugins_play to load vars for managed_node1 35374 1726882916.98000: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882916.98003: Calling groups_plugins_play to load vars for managed_node1 35374 1726882916.98541: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000006c 35374 1726882916.98548: WORKER PROCESS EXITING 35374 1726882916.98582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882916.98837: done with get_vars() 35374 1726882916.98843: done getting variables 35374 1726882916.98887: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:41:56 -0400 (0:00:00.031) 0:00:03.567 ****** 35374 1726882916.98907: entering _queue_task() for managed_node1/package 35374 1726882916.99058: worker is 1 (out of 1 available) 35374 1726882916.99073: exiting _queue_task() for managed_node1/package 35374 1726882916.99084: done queuing things up, now waiting for results queue to drain 35374 1726882916.99085: waiting for pending results... 
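The task at tasks/main.yml:96 mirrors the previous one for the nmstate Python bindings; only the package name differs in the sketch, and again it is inferred from the task title, not printed in the log:

    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate
        state: present
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version == '7'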
35374 1726882916.99238: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 35374 1726882916.99322: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000006d 35374 1726882916.99332: variable 'ansible_search_path' from source: unknown 35374 1726882916.99336: variable 'ansible_search_path' from source: unknown 35374 1726882916.99365: calling self._execute() 35374 1726882916.99427: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882916.99431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882916.99439: variable 'omit' from source: magic vars 35374 1726882916.99701: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.99712: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882916.99793: variable 'ansible_distribution_major_version' from source: facts 35374 1726882916.99797: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882916.99801: when evaluation is False, skipping this task 35374 1726882916.99803: _execute() done 35374 1726882916.99806: dumping result to json 35374 1726882916.99811: done dumping result, returning 35374 1726882916.99817: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-ee6a-9b8c-00000000006d] 35374 1726882916.99822: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000006d 35374 1726882916.99909: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000006d 35374 1726882916.99913: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882916.99954: no more pending results, returning what we have 35374 1726882916.99958: results queue empty 35374 1726882916.99959: checking for any_errors_fatal 35374 1726882916.99966: done checking for any_errors_fatal 35374 1726882916.99967: checking for max_fail_percentage 35374 1726882916.99970: done checking for max_fail_percentage 35374 1726882916.99971: checking to see if all hosts have failed and the running result is not ok 35374 1726882916.99972: done checking to see if all hosts have failed 35374 1726882916.99973: getting the remaining hosts for this loop 35374 1726882916.99974: done getting the remaining hosts for this loop 35374 1726882916.99977: getting the next task for host managed_node1 35374 1726882916.99981: done getting next task for host managed_node1 35374 1726882916.99984: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 35374 1726882916.99987: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.00000: getting variables 35374 1726882917.00001: in VariableManager get_vars() 35374 1726882917.00031: Calling all_inventory to load vars for managed_node1 35374 1726882917.00033: Calling groups_inventory to load vars for managed_node1 35374 1726882917.00034: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.00040: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.00041: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.00043: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.00145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.00275: done with get_vars() 35374 1726882917.00284: done getting variables 35374 1726882917.00320: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:41:57 -0400 (0:00:00.014) 0:00:03.582 ****** 35374 1726882917.00347: entering _queue_task() for managed_node1/service 35374 1726882917.00521: worker is 1 (out of 1 available) 35374 1726882917.00536: exiting _queue_task() for managed_node1/service 35374 1726882917.00547: done queuing things up, now waiting for results queue to drain 35374 1726882917.00549: waiting for pending results... 
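From tasks/main.yml:109 onward the role switches from the 'package' to the 'service' action. A sketch consistent with the task title and the loaded module follows; the restarted state and any wireless/team detection condition are assumptions:

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version == '7'
        # presumably also a test for wireless or team profiles in the requested configuration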
35374 1726882917.01012: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 35374 1726882917.01019: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000006e 35374 1726882917.01022: variable 'ansible_search_path' from source: unknown 35374 1726882917.01024: variable 'ansible_search_path' from source: unknown 35374 1726882917.01028: calling self._execute() 35374 1726882917.01030: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.01033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.01035: variable 'omit' from source: magic vars 35374 1726882917.01424: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.01441: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.01579: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.01605: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.01608: when evaluation is False, skipping this task 35374 1726882917.01613: _execute() done 35374 1726882917.01617: dumping result to json 35374 1726882917.01619: done dumping result, returning 35374 1726882917.01628: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-00000000006e] 35374 1726882917.01633: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000006e 35374 1726882917.01720: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000006e 35374 1726882917.01723: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.01767: no more pending results, returning what we have 35374 1726882917.01771: results queue empty 35374 1726882917.01771: checking for any_errors_fatal 35374 1726882917.01775: done checking for any_errors_fatal 35374 1726882917.01776: checking for max_fail_percentage 35374 1726882917.01778: done checking for max_fail_percentage 35374 1726882917.01778: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.01779: done checking to see if all hosts have failed 35374 1726882917.01780: getting the remaining hosts for this loop 35374 1726882917.01781: done getting the remaining hosts for this loop 35374 1726882917.01784: getting the next task for host managed_node1 35374 1726882917.01789: done getting next task for host managed_node1 35374 1726882917.01793: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 35374 1726882917.01795: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.01808: getting variables 35374 1726882917.01810: in VariableManager get_vars() 35374 1726882917.01858: Calling all_inventory to load vars for managed_node1 35374 1726882917.01861: Calling groups_inventory to load vars for managed_node1 35374 1726882917.01862: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.01871: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.01873: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.01875: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.02029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.02228: done with get_vars() 35374 1726882917.02237: done getting variables 35374 1726882917.02293: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:41:57 -0400 (0:00:00.019) 0:00:03.601 ******
35374 1726882917.02322: entering _queue_task() for managed_node1/service 35374 1726882917.02524: worker is 1 (out of 1 available) 35374 1726882917.02535: exiting _queue_task() for managed_node1/service 35374 1726882917.02545: done queuing things up, now waiting for results queue to drain 35374 1726882917.02547: waiting for pending results... 35374 1726882917.02790: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 35374 1726882917.02910: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000006f 35374 1726882917.02930: variable 'ansible_search_path' from source: unknown 35374 1726882917.02938: variable 'ansible_search_path' from source: unknown 35374 1726882917.02981: calling self._execute() 35374 1726882917.03066: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.03080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.03098: variable 'omit' from source: magic vars 35374 1726882917.03406: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.03411: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.03488: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.03492: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.03495: when evaluation is False, skipping this task 35374 1726882917.03498: _execute() done 35374 1726882917.03501: dumping result to json 35374 1726882917.03503: done dumping result, returning 35374 1726882917.03512: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-ee6a-9b8c-00000000006f] 35374 1726882917.03517: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000006f 35374 1726882917.03599: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000006f 35374 1726882917.03602: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
35374 1726882917.03665: no more pending results, returning what we have 35374 1726882917.03668: results queue empty 35374 1726882917.03669: checking for any_errors_fatal 35374 1726882917.03672: done checking for any_errors_fatal 35374 1726882917.03672: checking for max_fail_percentage 35374 1726882917.03674: done checking for max_fail_percentage 35374 1726882917.03674: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.03675: done checking to see if all hosts have failed 35374 1726882917.03676: getting the remaining hosts for this loop 35374 1726882917.03677: done getting the remaining hosts for this loop 35374 1726882917.03679: getting the next task for host managed_node1 35374 1726882917.03682: done getting next task for host managed_node1 35374 1726882917.03685: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 35374 1726882917.03686: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882917.03695: getting variables 35374 1726882917.03696: in VariableManager get_vars() 35374 1726882917.03727: Calling all_inventory to load vars for managed_node1 35374 1726882917.03729: Calling groups_inventory to load vars for managed_node1 35374 1726882917.03732: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.03738: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.03739: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.03741: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.03852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.03983: done with get_vars() 35374 1726882917.03990: done getting variables 35374 1726882917.04026: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024 21:41:57 -0400 (0:00:00.017) 0:00:03.619 ******
35374 1726882917.04048: entering _queue_task() for managed_node1/service 35374 1726882917.04200: worker is 1 (out of 1 available) 35374 1726882917.04213: exiting _queue_task() for managed_node1/service 35374 1726882917.04225: done queuing things up, now waiting for results queue to drain 35374 1726882917.04226: waiting for pending results...
35374 1726882917.04369: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 35374 1726882917.04447: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000070 35374 1726882917.04456: variable 'ansible_search_path' from source: unknown 35374 1726882917.04460: variable 'ansible_search_path' from source: unknown 35374 1726882917.04493: calling self._execute() 35374 1726882917.04547: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.04550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.04559: variable 'omit' from source: magic vars 35374 1726882917.04809: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.04814: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.04890: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.04897: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.04899: when evaluation is False, skipping this task 35374 1726882917.04902: _execute() done 35374 1726882917.04904: dumping result to json 35374 1726882917.04907: done dumping result, returning 35374 1726882917.04914: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-ee6a-9b8c-000000000070] 35374 1726882917.04924: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000070 35374 1726882917.05023: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000070 35374 1726882917.05030: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.05078: no more pending results, returning what we have 35374 1726882917.05082: results queue empty 35374 1726882917.05083: checking for any_errors_fatal 35374 1726882917.05087: done checking for any_errors_fatal 35374 1726882917.05088: checking for max_fail_percentage 35374 1726882917.05089: done checking for max_fail_percentage 35374 1726882917.05090: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.05091: done checking to see if all hosts have failed 35374 1726882917.05091: getting the remaining hosts for this loop 35374 1726882917.05093: done getting the remaining hosts for this loop 35374 1726882917.05096: getting the next task for host managed_node1 35374 1726882917.05102: done getting next task for host managed_node1 35374 1726882917.05106: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 35374 1726882917.05108: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.05123: getting variables 35374 1726882917.05124: in VariableManager get_vars() 35374 1726882917.05213: Calling all_inventory to load vars for managed_node1 35374 1726882917.05215: Calling groups_inventory to load vars for managed_node1 35374 1726882917.05218: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.05225: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.05228: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.05231: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.05444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.05645: done with get_vars() 35374 1726882917.05653: done getting variables 35374 1726882917.05707: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 21:41:57 -0400 (0:00:00.016) 0:00:03.635 ******
35374 1726882917.05732: entering _queue_task() for managed_node1/service 35374 1726882917.05927: worker is 1 (out of 1 available) 35374 1726882917.05939: exiting _queue_task() for managed_node1/service 35374 1726882917.05949: done queuing things up, now waiting for results queue to drain 35374 1726882917.05950: waiting for pending results... 35374 1726882917.06175: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 35374 1726882917.06243: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000071 35374 1726882917.06254: variable 'ansible_search_path' from source: unknown 35374 1726882917.06261: variable 'ansible_search_path' from source: unknown 35374 1726882917.06295: calling self._execute() 35374 1726882917.06359: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.06375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.06382: variable 'omit' from source: magic vars 35374 1726882917.06625: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.06636: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.06714: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.06720: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.06723: when evaluation is False, skipping this task 35374 1726882917.06726: _execute() done 35374 1726882917.06728: dumping result to json 35374 1726882917.06731: done dumping result, returning 35374 1726882917.06737: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-ee6a-9b8c-000000000071] 35374 1726882917.06743: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000071 35374 1726882917.06823: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000071 35374 1726882917.06826: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
35374 1726882917.06875: no more pending results, returning what we have 35374 1726882917.06878: results queue empty 35374 1726882917.06879: checking for any_errors_fatal 35374 1726882917.06883: done checking for any_errors_fatal 35374 1726882917.06884: checking for max_fail_percentage 35374 1726882917.06886: done checking for max_fail_percentage 35374 1726882917.06886: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.06887: done checking to see if all hosts have failed 35374 1726882917.06888: getting the remaining hosts for this loop 35374 1726882917.06889: done getting the remaining hosts for this loop 35374 1726882917.06892: getting the next task for host managed_node1 35374 1726882917.06896: done getting next task for host managed_node1 35374 1726882917.06899: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 35374 1726882917.06902: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882917.06911: getting variables 35374 1726882917.06912: in VariableManager get_vars() 35374 1726882917.06942: Calling all_inventory to load vars for managed_node1 35374 1726882917.06944: Calling groups_inventory to load vars for managed_node1 35374 1726882917.06946: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.06952: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.06953: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.06955: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.07054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.07176: done with get_vars() 35374 1726882917.07183: done getting variables 35374 1726882917.07218: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 21:41:57 -0400 (0:00:00.015) 0:00:03.651 ******
35374 1726882917.07238: entering _queue_task() for managed_node1/copy 35374 1726882917.07388: worker is 1 (out of 1 available) 35374 1726882917.07401: exiting _queue_task() for managed_node1/copy 35374 1726882917.07413: done queuing things up, now waiting for results queue to drain 35374 1726882917.07414: waiting for pending results...
35374 1726882917.07560: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 35374 1726882917.07635: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000072 35374 1726882917.07643: variable 'ansible_search_path' from source: unknown 35374 1726882917.07646: variable 'ansible_search_path' from source: unknown 35374 1726882917.07677: calling self._execute() 35374 1726882917.07734: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.07737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.07746: variable 'omit' from source: magic vars 35374 1726882917.07981: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.07991: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.08066: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.08073: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.08077: when evaluation is False, skipping this task 35374 1726882917.08080: _execute() done 35374 1726882917.08082: dumping result to json 35374 1726882917.08084: done dumping result, returning 35374 1726882917.08093: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-ee6a-9b8c-000000000072] 35374 1726882917.08097: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000072 35374 1726882917.08189: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000072 35374 1726882917.08194: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.08231: no more pending results, returning what we have 35374 1726882917.08233: results queue empty 35374 1726882917.08234: checking for any_errors_fatal 35374 1726882917.08239: done checking for any_errors_fatal 35374 1726882917.08240: checking for max_fail_percentage 35374 1726882917.08241: done checking for max_fail_percentage 35374 1726882917.08242: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.08243: done checking to see if all hosts have failed 35374 1726882917.08250: getting the remaining hosts for this loop 35374 1726882917.08251: done getting the remaining hosts for this loop 35374 1726882917.08254: getting the next task for host managed_node1 35374 1726882917.08260: done getting next task for host managed_node1 35374 1726882917.08263: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 35374 1726882917.08265: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.08277: getting variables 35374 1726882917.08278: in VariableManager get_vars() 35374 1726882917.08306: Calling all_inventory to load vars for managed_node1 35374 1726882917.08307: Calling groups_inventory to load vars for managed_node1 35374 1726882917.08309: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.08314: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.08316: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.08318: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.08449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.08571: done with get_vars() 35374 1726882917.08579: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:03.664 ****** 35374 1726882917.08630: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 35374 1726882917.08783: worker is 1 (out of 1 available) 35374 1726882917.08795: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 35374 1726882917.08806: done queuing things up, now waiting for results queue to drain 35374 1726882917.08808: waiting for pending results... 35374 1726882917.08950: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 35374 1726882917.09020: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000073 35374 1726882917.09033: variable 'ansible_search_path' from source: unknown 35374 1726882917.09037: variable 'ansible_search_path' from source: unknown 35374 1726882917.09063: calling self._execute() 35374 1726882917.09120: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.09124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.09131: variable 'omit' from source: magic vars 35374 1726882917.09392: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.09401: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.09479: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.09485: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.09487: when evaluation is False, skipping this task 35374 1726882917.09490: _execute() done 35374 1726882917.09493: dumping result to json 35374 1726882917.09495: done dumping result, returning 35374 1726882917.09503: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-ee6a-9b8c-000000000073] 35374 1726882917.09508: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000073 35374 1726882917.09603: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000073 35374 1726882917.09607: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.09646: no more pending results, returning what we have 35374 1726882917.09649: results queue empty 35374 1726882917.09650: checking for any_errors_fatal 35374 1726882917.09654: 
done checking for any_errors_fatal 35374 1726882917.09655: checking for max_fail_percentage 35374 1726882917.09656: done checking for max_fail_percentage 35374 1726882917.09657: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.09657: done checking to see if all hosts have failed 35374 1726882917.09658: getting the remaining hosts for this loop 35374 1726882917.09659: done getting the remaining hosts for this loop 35374 1726882917.09661: getting the next task for host managed_node1 35374 1726882917.09670: done getting next task for host managed_node1 35374 1726882917.09674: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 35374 1726882917.09677: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882917.09691: getting variables 35374 1726882917.09692: in VariableManager get_vars() 35374 1726882917.09719: Calling all_inventory to load vars for managed_node1 35374 1726882917.09721: Calling groups_inventory to load vars for managed_node1 35374 1726882917.09723: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.09728: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.09729: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.09731: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.09832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.09954: done with get_vars() 35374 1726882917.09960: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:03.678 ****** 35374 1726882917.10014: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 35374 1726882917.10153: worker is 1 (out of 1 available) 35374 1726882917.10165: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 35374 1726882917.10178: done queuing things up, now waiting for results queue to drain 35374 1726882917.10179: waiting for pending results... 
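The two configuration steps at tasks/main.yml:159 and 171 queue the role's own action plugins (fedora.linux_system_roles.network_connections and fedora.linux_system_roles.network_state) instead of builtin modules. Their parameters never reach the log because both tasks are skipped, so the pass-through arguments below are assumptions; only the plugin names and the when pair are attested by the trace:

    - name: Configure networking connection profiles
      fedora.linux_system_roles.network_connections:
        connections: "{{ network_connections }}"  # assumed pass-through of the role variable
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version == '7'

    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        state: "{{ network_state }}"  # assumed pass-through of the role variable
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version == '7'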
35374 1726882917.10324: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 35374 1726882917.10399: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000074 35374 1726882917.10408: variable 'ansible_search_path' from source: unknown 35374 1726882917.10411: variable 'ansible_search_path' from source: unknown 35374 1726882917.10435: calling self._execute() 35374 1726882917.10489: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.10493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.10500: variable 'omit' from source: magic vars 35374 1726882917.10729: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.10739: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.10816: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.10819: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.10822: when evaluation is False, skipping this task 35374 1726882917.10826: _execute() done 35374 1726882917.10829: dumping result to json 35374 1726882917.10833: done dumping result, returning 35374 1726882917.10840: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-ee6a-9b8c-000000000074] 35374 1726882917.10843: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000074 35374 1726882917.10925: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000074 35374 1726882917.10928: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.10990: no more pending results, returning what we have 35374 1726882917.10993: results queue empty 35374 1726882917.10993: checking for any_errors_fatal 35374 1726882917.10997: done checking for any_errors_fatal 35374 1726882917.10998: checking for max_fail_percentage 35374 1726882917.10999: done checking for max_fail_percentage 35374 1726882917.10999: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.11000: done checking to see if all hosts have failed 35374 1726882917.11000: getting the remaining hosts for this loop 35374 1726882917.11001: done getting the remaining hosts for this loop 35374 1726882917.11003: getting the next task for host managed_node1 35374 1726882917.11006: done getting next task for host managed_node1 35374 1726882917.11009: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 35374 1726882917.11010: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.11019: getting variables 35374 1726882917.11020: in VariableManager get_vars() 35374 1726882917.11053: Calling all_inventory to load vars for managed_node1 35374 1726882917.11055: Calling groups_inventory to load vars for managed_node1 35374 1726882917.11056: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.11062: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.11065: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.11068: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.11197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.11314: done with get_vars() 35374 1726882917.11321: done getting variables 35374 1726882917.11356: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:03.692 ******
35374 1726882917.11380: entering _queue_task() for managed_node1/debug 35374 1726882917.11517: worker is 1 (out of 1 available) 35374 1726882917.11529: exiting _queue_task() for managed_node1/debug 35374 1726882917.11539: done queuing things up, now waiting for results queue to drain 35374 1726882917.11540: waiting for pending results... 35374 1726882917.11684: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 35374 1726882917.11749: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000075 35374 1726882917.11760: variable 'ansible_search_path' from source: unknown 35374 1726882917.11763: variable 'ansible_search_path' from source: unknown 35374 1726882917.11789: calling self._execute() 35374 1726882917.11841: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.11845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.11852: variable 'omit' from source: magic vars 35374 1726882917.12084: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.12094: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.12172: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.12176: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.12179: when evaluation is False, skipping this task 35374 1726882917.12181: _execute() done 35374 1726882917.12184: dumping result to json 35374 1726882917.12186: done dumping result, returning 35374 1726882917.12193: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-ee6a-9b8c-000000000075] 35374 1726882917.12198: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000075 35374 1726882917.12283: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000075 35374 1726882917.12286: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
35374 1726882917.12341: no more pending results, returning what we have 35374 1726882917.12343: results queue empty 35374 1726882917.12344: checking for any_errors_fatal 35374 1726882917.12347: done checking for any_errors_fatal 35374 1726882917.12347: checking for max_fail_percentage 35374 1726882917.12348: done checking for max_fail_percentage 35374 1726882917.12349: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.12349: done checking to see if all hosts have failed 35374 1726882917.12350: getting the remaining hosts for this loop 35374 1726882917.12351: done getting the remaining hosts for this loop 35374 1726882917.12352: getting the next task for host managed_node1 35374 1726882917.12356: done getting next task for host managed_node1 35374 1726882917.12358: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 35374 1726882917.12360: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882917.12373: getting variables 35374 1726882917.12379: in VariableManager get_vars() 35374 1726882917.12407: Calling all_inventory to load vars for managed_node1 35374 1726882917.12409: Calling groups_inventory to load vars for managed_node1 35374 1726882917.12410: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.12415: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.12417: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.12419: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.12519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.12639: done with get_vars() 35374 1726882917.12645: done getting variables 35374 1726882917.12685: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:03.705 ******
35374 1726882917.12706: entering _queue_task() for managed_node1/debug 35374 1726882917.12842: worker is 1 (out of 1 available) 35374 1726882917.12853: exiting _queue_task() for managed_node1/debug 35374 1726882917.12865: done queuing things up, now waiting for results queue to drain 35374 1726882917.12866: waiting for pending results...
35374 1726882917.13012: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 35374 1726882917.13083: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000076 35374 1726882917.13093: variable 'ansible_search_path' from source: unknown 35374 1726882917.13096: variable 'ansible_search_path' from source: unknown 35374 1726882917.13120: calling self._execute() 35374 1726882917.13173: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.13177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.13183: variable 'omit' from source: magic vars 35374 1726882917.13411: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.13420: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.13496: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.13500: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.13503: when evaluation is False, skipping this task 35374 1726882917.13505: _execute() done 35374 1726882917.13508: dumping result to json 35374 1726882917.13513: done dumping result, returning 35374 1726882917.13521: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-ee6a-9b8c-000000000076] 35374 1726882917.13524: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000076 35374 1726882917.13606: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000076 35374 1726882917.13608: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882917.13666: no more pending results, returning what we have 35374 1726882917.13670: results queue empty 35374 1726882917.13671: checking for any_errors_fatal 35374 1726882917.13674: done checking for any_errors_fatal 35374 1726882917.13675: checking for max_fail_percentage 35374 1726882917.13676: done checking for max_fail_percentage 35374 1726882917.13677: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.13677: done checking to see if all hosts have failed 35374 1726882917.13678: getting the remaining hosts for this loop 35374 1726882917.13678: done getting the remaining hosts for this loop 35374 1726882917.13680: getting the next task for host managed_node1 35374 1726882917.13684: done getting next task for host managed_node1 35374 1726882917.13686: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 35374 1726882917.13688: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.13697: getting variables 35374 1726882917.13698: in VariableManager get_vars() 35374 1726882917.13731: Calling all_inventory to load vars for managed_node1 35374 1726882917.13733: Calling groups_inventory to load vars for managed_node1 35374 1726882917.13734: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.13740: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.13741: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.13743: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.13876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.13994: done with get_vars() 35374 1726882917.14000: done getting variables 35374 1726882917.14035: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:03.719 ****** 35374 1726882917.14056: entering _queue_task() for managed_node1/debug 35374 1726882917.14195: worker is 1 (out of 1 available) 35374 1726882917.14206: exiting _queue_task() for managed_node1/debug 35374 1726882917.14216: done queuing things up, now waiting for results queue to drain 35374 1726882917.14217: waiting for pending results... 35374 1726882917.14376: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 35374 1726882917.14449: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000077 35374 1726882917.14460: variable 'ansible_search_path' from source: unknown 35374 1726882917.14465: variable 'ansible_search_path' from source: unknown 35374 1726882917.14498: calling self._execute() 35374 1726882917.14552: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.14557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.14568: variable 'omit' from source: magic vars 35374 1726882917.14822: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.14832: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.14909: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.14912: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.14915: when evaluation is False, skipping this task 35374 1726882917.14920: _execute() done 35374 1726882917.14922: dumping result to json 35374 1726882917.14925: done dumping result, returning 35374 1726882917.14931: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-ee6a-9b8c-000000000077] 35374 1726882917.14942: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000077 35374 1726882917.15019: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000077 35374 1726882917.15022: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 
1726882917.15073: no more pending results, returning what we have 35374 1726882917.15076: results queue empty 35374 1726882917.15077: checking for any_errors_fatal 35374 1726882917.15082: done checking for any_errors_fatal 35374 1726882917.15082: checking for max_fail_percentage 35374 1726882917.15084: done checking for max_fail_percentage 35374 1726882917.15085: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.15085: done checking to see if all hosts have failed 35374 1726882917.15086: getting the remaining hosts for this loop 35374 1726882917.15087: done getting the remaining hosts for this loop 35374 1726882917.15090: getting the next task for host managed_node1 35374 1726882917.15094: done getting next task for host managed_node1 35374 1726882917.15097: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 35374 1726882917.15100: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882917.15112: getting variables 35374 1726882917.15113: in VariableManager get_vars() 35374 1726882917.15140: Calling all_inventory to load vars for managed_node1 35374 1726882917.15141: Calling groups_inventory to load vars for managed_node1 35374 1726882917.15143: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.15152: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.15154: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.15156: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.15255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.15382: done with get_vars() 35374 1726882917.15388: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:03.733 ****** 35374 1726882917.15445: entering _queue_task() for managed_node1/ping 35374 1726882917.15600: worker is 1 (out of 1 available) 35374 1726882917.15613: exiting _queue_task() for managed_node1/ping 35374 1726882917.15624: done queuing things up, now waiting for results queue to drain 35374 1726882917.15625: waiting for pending results... 
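[Editor's note] Every skipped task in this trace follows the same shape: the executor evaluates the conditionals in order, stops at the first False, and returns a result whose "false_condition" field names the failing expression. The "Re-test connectivity" task queued above is a plain ping action. A minimal sketch of a task that would produce exactly this evaluation order is below; the two `when` entries are the conditionals the log shows being evaluated, everything else is assumed rather than copied from the role source.

```yaml
# Minimal sketch, not the role's actual source: a ping task carrying
# the two conditionals the executor evaluates above. The second one is
# False on this host, so the task is skipped with
# "skip_reason": "Conditional result was False".
- name: Re-test connectivity
  ansible.builtin.ping:
  when:
    - ansible_distribution_major_version != '6'  # evaluated first: True
    - ansible_distribution_major_version == '7'  # evaluated second: False -> skip
```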
35374 1726882917.15765: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 35374 1726882917.15836: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000078 35374 1726882917.15847: variable 'ansible_search_path' from source: unknown 35374 1726882917.15850: variable 'ansible_search_path' from source: unknown 35374 1726882917.15877: calling self._execute() 35374 1726882917.15931: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.15934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.15941: variable 'omit' from source: magic vars 35374 1726882917.16172: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.16180: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.16260: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.16266: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.16270: when evaluation is False, skipping this task 35374 1726882917.16275: _execute() done 35374 1726882917.16278: dumping result to json 35374 1726882917.16280: done dumping result, returning 35374 1726882917.16288: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-ee6a-9b8c-000000000078] 35374 1726882917.16295: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000078 35374 1726882917.16376: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000078 35374 1726882917.16379: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.16428: no more pending results, returning what we have 35374 1726882917.16431: results queue empty 35374 1726882917.16432: checking for any_errors_fatal 35374 1726882917.16436: done checking for any_errors_fatal 35374 1726882917.16436: checking for max_fail_percentage 35374 1726882917.16437: done checking for max_fail_percentage 35374 1726882917.16438: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.16439: done checking to see if all hosts have failed 35374 1726882917.16440: getting the remaining hosts for this loop 35374 1726882917.16441: done getting the remaining hosts for this loop 35374 1726882917.16443: getting the next task for host managed_node1 35374 1726882917.16449: done getting next task for host managed_node1 35374 1726882917.16451: ^ task is: TASK: meta (role_complete) 35374 1726882917.16454: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.16466: getting variables 35374 1726882917.16469: in VariableManager get_vars() 35374 1726882917.16499: Calling all_inventory to load vars for managed_node1 35374 1726882917.16501: Calling groups_inventory to load vars for managed_node1 35374 1726882917.16502: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.16508: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.16509: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.16511: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.16612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.16760: done with get_vars() 35374 1726882917.16770: done getting variables 35374 1726882917.16822: done queuing things up, now waiting for results queue to drain 35374 1726882917.16823: results queue empty 35374 1726882917.16823: checking for any_errors_fatal 35374 1726882917.16825: done checking for any_errors_fatal 35374 1726882917.16825: checking for max_fail_percentage 35374 1726882917.16826: done checking for max_fail_percentage 35374 1726882917.16826: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.16827: done checking to see if all hosts have failed 35374 1726882917.16827: getting the remaining hosts for this loop 35374 1726882917.16828: done getting the remaining hosts for this loop 35374 1726882917.16829: getting the next task for host managed_node1 35374 1726882917.16831: done getting next task for host managed_node1 35374 1726882917.16832: ^ task is: TASK: TEST: wireless connection with 802.1x TLS-EAP 35374 1726882917.16833: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.16835: getting variables 35374 1726882917.16835: in VariableManager get_vars() 35374 1726882917.16845: Calling all_inventory to load vars for managed_node1 35374 1726882917.16846: Calling groups_inventory to load vars for managed_node1 35374 1726882917.16847: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.16850: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.16851: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.16853: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.16934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.17045: done with get_vars() 35374 1726882917.17050: done getting variables 35374 1726882917.17080: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: wireless connection with 802.1x TLS-EAP] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:53 Friday 20 September 2024 21:41:57 -0400 (0:00:00.016) 0:00:03.749 ****** 35374 1726882917.17096: entering _queue_task() for managed_node1/debug 35374 1726882917.17245: worker is 1 (out of 1 available) 35374 1726882917.17258: exiting _queue_task() for managed_node1/debug 35374 1726882917.17274: done queuing things up, now waiting for results queue to drain 35374 1726882917.17276: waiting for pending results... 35374 1726882917.17408: running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with 802.1x TLS-EAP 35374 1726882917.17454: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000a8 35374 1726882917.17468: variable 'ansible_search_path' from source: unknown 35374 1726882917.17498: calling self._execute() 35374 1726882917.17557: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.17566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.17577: variable 'omit' from source: magic vars 35374 1726882917.17828: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.17837: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.17918: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.17922: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.17926: when evaluation is False, skipping this task 35374 1726882917.17930: _execute() done 35374 1726882917.17932: dumping result to json 35374 1726882917.17938: done dumping result, returning 35374 1726882917.17943: done running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with 802.1x TLS-EAP [0e448fcc-3ce9-ee6a-9b8c-0000000000a8] 35374 1726882917.17950: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000a8 35374 1726882917.18027: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000a8 35374 1726882917.18030: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882917.18088: no more pending results, returning what we have 35374 1726882917.18091: results queue empty 35374 1726882917.18092: checking 
for any_errors_fatal 35374 1726882917.18094: done checking for any_errors_fatal 35374 1726882917.18095: checking for max_fail_percentage 35374 1726882917.18096: done checking for max_fail_percentage 35374 1726882917.18097: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.18098: done checking to see if all hosts have failed 35374 1726882917.18098: getting the remaining hosts for this loop 35374 1726882917.18100: done getting the remaining hosts for this loop 35374 1726882917.18102: getting the next task for host managed_node1 35374 1726882917.18107: done getting next task for host managed_node1 35374 1726882917.18111: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 35374 1726882917.18114: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882917.18126: getting variables 35374 1726882917.18128: in VariableManager get_vars() 35374 1726882917.18162: Calling all_inventory to load vars for managed_node1 35374 1726882917.18165: Calling groups_inventory to load vars for managed_node1 35374 1726882917.18167: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.18173: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.18175: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.18176: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.18278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.18425: done with get_vars() 35374 1726882917.18432: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:03.763 ****** 35374 1726882917.18494: entering _queue_task() for managed_node1/include_tasks 35374 1726882917.18644: worker is 1 (out of 1 available) 35374 1726882917.18656: exiting _queue_task() for managed_node1/include_tasks 35374 1726882917.18668: done queuing things up, now waiting for results queue to drain 35374 1726882917.18669: waiting for pending results... 
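[Editor's note] The next task ("Ensure ansible_facts used by role", main.yml:4) is an include_tasks action. A hedged sketch of how a role typically gates fact collection follows; the included file name and the guard condition are assumptions, not taken from this log.

```yaml
# Sketch under assumptions: pull in a task file that gathers only the
# facts the role needs, and only when they are not already present.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml  # file name assumed
  when: ansible_facts | length == 0             # guard condition assumed
```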
35374 1726882917.18818: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 35374 1726882917.18893: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000b0 35374 1726882917.18903: variable 'ansible_search_path' from source: unknown 35374 1726882917.18912: variable 'ansible_search_path' from source: unknown 35374 1726882917.18936: calling self._execute() 35374 1726882917.18994: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.18997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.19008: variable 'omit' from source: magic vars 35374 1726882917.19245: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.19255: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.19332: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.19335: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.19338: when evaluation is False, skipping this task 35374 1726882917.19343: _execute() done 35374 1726882917.19346: dumping result to json 35374 1726882917.19348: done dumping result, returning 35374 1726882917.19353: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-ee6a-9b8c-0000000000b0] 35374 1726882917.19369: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b0 35374 1726882917.19448: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b0 35374 1726882917.19451: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.19510: no more pending results, returning what we have 35374 1726882917.19513: results queue empty 35374 1726882917.19514: checking for any_errors_fatal 35374 1726882917.19518: done checking for any_errors_fatal 35374 1726882917.19518: checking for max_fail_percentage 35374 1726882917.19520: done checking for max_fail_percentage 35374 1726882917.19520: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.19521: done checking to see if all hosts have failed 35374 1726882917.19522: getting the remaining hosts for this loop 35374 1726882917.19523: done getting the remaining hosts for this loop 35374 1726882917.19526: getting the next task for host managed_node1 35374 1726882917.19530: done getting next task for host managed_node1 35374 1726882917.19534: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 35374 1726882917.19536: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.19546: getting variables 35374 1726882917.19547: in VariableManager get_vars() 35374 1726882917.19579: Calling all_inventory to load vars for managed_node1 35374 1726882917.19581: Calling groups_inventory to load vars for managed_node1 35374 1726882917.19582: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.19588: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.19589: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.19591: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.19694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.19816: done with get_vars() 35374 1726882917.19823: done getting variables 35374 1726882917.19858: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:03.777 ****** 35374 1726882917.19881: entering _queue_task() for managed_node1/debug 35374 1726882917.20038: worker is 1 (out of 1 available) 35374 1726882917.20051: exiting _queue_task() for managed_node1/debug 35374 1726882917.20062: done queuing things up, now waiting for results queue to drain 35374 1726882917.20065: waiting for pending results... 35374 1726882917.20210: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 35374 1726882917.20282: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000b1 35374 1726882917.20292: variable 'ansible_search_path' from source: unknown 35374 1726882917.20296: variable 'ansible_search_path' from source: unknown 35374 1726882917.20320: calling self._execute() 35374 1726882917.20376: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.20379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.20387: variable 'omit' from source: magic vars 35374 1726882917.20616: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.20626: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.20704: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.20707: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.20711: when evaluation is False, skipping this task 35374 1726882917.20715: _execute() done 35374 1726882917.20719: dumping result to json 35374 1726882917.20722: done dumping result, returning 35374 1726882917.20727: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-ee6a-9b8c-0000000000b1] 35374 1726882917.20735: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b1 35374 1726882917.20810: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b1 35374 1726882917.20813: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882917.20870: no more pending results, returning 
what we have 35374 1726882917.20874: results queue empty 35374 1726882917.20875: checking for any_errors_fatal 35374 1726882917.20878: done checking for any_errors_fatal 35374 1726882917.20879: checking for max_fail_percentage 35374 1726882917.20881: done checking for max_fail_percentage 35374 1726882917.20881: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.20882: done checking to see if all hosts have failed 35374 1726882917.20883: getting the remaining hosts for this loop 35374 1726882917.20884: done getting the remaining hosts for this loop 35374 1726882917.20887: getting the next task for host managed_node1 35374 1726882917.20891: done getting next task for host managed_node1 35374 1726882917.20895: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 35374 1726882917.20897: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882917.20911: getting variables 35374 1726882917.20913: in VariableManager get_vars() 35374 1726882917.20941: Calling all_inventory to load vars for managed_node1 35374 1726882917.20943: Calling groups_inventory to load vars for managed_node1 35374 1726882917.20944: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.20949: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.20951: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.20952: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.21108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.21289: done with get_vars() 35374 1726882917.21298: done getting variables 35374 1726882917.21347: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:57 -0400 (0:00:00.014) 0:00:03.792 ****** 35374 1726882917.21376: entering _queue_task() for managed_node1/fail 35374 1726882917.21569: worker is 1 (out of 1 available) 35374 1726882917.21582: exiting _queue_task() for managed_node1/fail 35374 1726882917.21592: done queuing things up, now waiting for results queue to drain 35374 1726882917.21593: waiting for pending results... 
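[Editor's note] main.yml:11 loads the fail action for the initscripts guard. A plausible reconstruction follows; network_state and network_provider are this role's public variables, but the exact expressions below are assumptions.

```yaml
# Sketch, assuming the guard fires only when a non-empty network_state
# is combined with the initscripts provider.
- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state requires the nm (NetworkManager) provider.
  when:
    - network_state | d({}) != {}                  # assumed guard
    - network_provider | d('nm') == 'initscripts'  # assumed guard
```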
35374 1726882917.21830: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 35374 1726882917.21947: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000b2 35374 1726882917.21969: variable 'ansible_search_path' from source: unknown 35374 1726882917.21980: variable 'ansible_search_path' from source: unknown 35374 1726882917.22019: calling self._execute() 35374 1726882917.22100: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.22111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.22124: variable 'omit' from source: magic vars 35374 1726882917.22464: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.22488: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.22610: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.22621: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.22627: when evaluation is False, skipping this task 35374 1726882917.22634: _execute() done 35374 1726882917.22639: dumping result to json 35374 1726882917.22646: done dumping result, returning 35374 1726882917.22655: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-ee6a-9b8c-0000000000b2] 35374 1726882917.22668: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b2 35374 1726882917.22765: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b2 35374 1726882917.22769: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.22851: no more pending results, returning what we have 35374 1726882917.22854: results queue empty 35374 1726882917.22854: checking for any_errors_fatal 35374 1726882917.22859: done checking for any_errors_fatal 35374 1726882917.22859: checking for max_fail_percentage 35374 1726882917.22861: done checking for max_fail_percentage 35374 1726882917.22862: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.22863: done checking to see if all hosts have failed 35374 1726882917.22865: getting the remaining hosts for this loop 35374 1726882917.22867: done getting the remaining hosts for this loop 35374 1726882917.22870: getting the next task for host managed_node1 35374 1726882917.22874: done getting next task for host managed_node1 35374 1726882917.22878: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 35374 1726882917.22880: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 35374 1726882917.22893: getting variables 35374 1726882917.22894: in VariableManager get_vars() 35374 1726882917.22927: Calling all_inventory to load vars for managed_node1 35374 1726882917.22930: Calling groups_inventory to load vars for managed_node1 35374 1726882917.22931: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.22937: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.22939: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.22940: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.23050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.23172: done with get_vars() 35374 1726882917.23179: done getting variables 35374 1726882917.23214: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:41:57 -0400 (0:00:00.018) 0:00:03.811 ****** 35374 1726882917.23236: entering _queue_task() for managed_node1/fail 35374 1726882917.23386: worker is 1 (out of 1 available) 35374 1726882917.23400: exiting _queue_task() for managed_node1/fail 35374 1726882917.23411: done queuing things up, now waiting for results queue to drain 35374 1726882917.23413: waiting for pending results... 
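[Editor's note] main.yml:18 is another fail guard, keyed on the host's major version. The comparison in the sketch below casts the fact to int before comparing, which avoids lexicographic string comparison (as strings, '10' < '8'); the surrounding condition is assumed.

```yaml
- name: >-
    Abort applying the network state configuration if the system
    version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires a managed host running EL8 or later.
  when:
    - network_state | d({}) != {}                   # assumed
    - ansible_distribution_major_version | int < 8  # int cast avoids '10' < '8'
```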
35374 1726882917.23554: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 35374 1726882917.23629: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000b3 35374 1726882917.23639: variable 'ansible_search_path' from source: unknown 35374 1726882917.23643: variable 'ansible_search_path' from source: unknown 35374 1726882917.23673: calling self._execute() 35374 1726882917.23733: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.23736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.23745: variable 'omit' from source: magic vars 35374 1726882917.24000: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.24012: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.24093: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.24096: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.24099: when evaluation is False, skipping this task 35374 1726882917.24108: _execute() done 35374 1726882917.24111: dumping result to json 35374 1726882917.24114: done dumping result, returning 35374 1726882917.24120: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-ee6a-9b8c-0000000000b3] 35374 1726882917.24129: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b3 35374 1726882917.24211: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b3 35374 1726882917.24215: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.24257: no more pending results, returning what we have 35374 1726882917.24260: results queue empty 35374 1726882917.24261: checking for any_errors_fatal 35374 1726882917.24267: done checking for any_errors_fatal 35374 1726882917.24268: checking for max_fail_percentage 35374 1726882917.24269: done checking for max_fail_percentage 35374 1726882917.24270: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.24271: done checking to see if all hosts have failed 35374 1726882917.24272: getting the remaining hosts for this loop 35374 1726882917.24273: done getting the remaining hosts for this loop 35374 1726882917.24276: getting the next task for host managed_node1 35374 1726882917.24281: done getting next task for host managed_node1 35374 1726882917.24285: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 35374 1726882917.24287: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.24301: getting variables 35374 1726882917.24302: in VariableManager get_vars() 35374 1726882917.24334: Calling all_inventory to load vars for managed_node1 35374 1726882917.24336: Calling groups_inventory to load vars for managed_node1 35374 1726882917.24338: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.24344: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.24345: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.24347: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.24481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.24603: done with get_vars() 35374 1726882917.24612: done getting variables 35374 1726882917.24653: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:41:57 -0400 (0:00:00.014) 0:00:03.825 ****** 35374 1726882917.24678: entering _queue_task() for managed_node1/fail 35374 1726882917.24858: worker is 1 (out of 1 available) 35374 1726882917.24873: exiting _queue_task() for managed_node1/fail 35374 1726882917.24899: done queuing things up, now waiting for results queue to drain 35374 1726882917.24901: waiting for pending results... 
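[Editor's note] The EL10 teaming guard (main.yml:25) has to detect whether any requested connection is of type team. One hedged way to express that with Jinja filters, assuming network_connections is a list of dicts with an optional type key:

```yaml
- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming (type team) is not supported on EL10 and later.
  when:
    - ansible_distribution_major_version | int >= 10  # assumed
    - network_connections | d([])
        | selectattr('type', 'defined')
        | selectattr('type', 'eq', 'team')
        | list | length > 0
```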
35374 1726882917.25045: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 35374 1726882917.25132: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000b4 35374 1726882917.25143: variable 'ansible_search_path' from source: unknown 35374 1726882917.25146: variable 'ansible_search_path' from source: unknown 35374 1726882917.25183: calling self._execute() 35374 1726882917.25235: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.25239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.25250: variable 'omit' from source: magic vars 35374 1726882917.25545: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.25562: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.25677: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.25689: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.25696: when evaluation is False, skipping this task 35374 1726882917.25703: _execute() done 35374 1726882917.25710: dumping result to json 35374 1726882917.25718: done dumping result, returning 35374 1726882917.25730: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-ee6a-9b8c-0000000000b4] 35374 1726882917.25739: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b4 35374 1726882917.25839: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b4 35374 1726882917.25847: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.25903: no more pending results, returning what we have 35374 1726882917.25907: results queue empty 35374 1726882917.25908: checking for any_errors_fatal 35374 1726882917.25914: done checking for any_errors_fatal 35374 1726882917.25915: checking for max_fail_percentage 35374 1726882917.25916: done checking for max_fail_percentage 35374 1726882917.25917: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.25918: done checking to see if all hosts have failed 35374 1726882917.25919: getting the remaining hosts for this loop 35374 1726882917.25920: done getting the remaining hosts for this loop 35374 1726882917.25924: getting the next task for host managed_node1 35374 1726882917.25938: done getting next task for host managed_node1 35374 1726882917.25943: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 35374 1726882917.25945: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.25959: getting variables 35374 1726882917.25961: in VariableManager get_vars() 35374 1726882917.26000: Calling all_inventory to load vars for managed_node1 35374 1726882917.26002: Calling groups_inventory to load vars for managed_node1 35374 1726882917.26005: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.26012: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.26015: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.26018: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.26193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.26415: done with get_vars() 35374 1726882917.26433: done getting variables 35374 1726882917.26514: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:57 -0400 (0:00:00.018) 0:00:03.844 ****** 35374 1726882917.26541: entering _queue_task() for managed_node1/dnf 35374 1726882917.26727: worker is 1 (out of 1 available) 35374 1726882917.26740: exiting _queue_task() for managed_node1/dnf 35374 1726882917.26751: done queuing things up, now waiting for results queue to drain 35374 1726882917.26753: waiting for pending results... 
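[Editor's note] main.yml:36 queues the dnf action to probe for pending updates. Running a state: latest task under check_mode is a common way to ask "would anything change?" without touching the host; the package list variable and register name here are assumptions.

```yaml
- name: >-
    Check if updates for network packages are available through the
    DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"    # variable name assumed
    state: latest
  check_mode: true                    # report, don't upgrade
  register: __network_dnf_updates    # register name assumed
  when: ansible_facts['pkg_mgr'] == 'dnf'
```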
35374 1726882917.26915: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 35374 1726882917.26992: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000b5 35374 1726882917.27005: variable 'ansible_search_path' from source: unknown 35374 1726882917.27008: variable 'ansible_search_path' from source: unknown 35374 1726882917.27038: calling self._execute() 35374 1726882917.27093: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.27096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.27104: variable 'omit' from source: magic vars 35374 1726882917.27341: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.27352: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.27429: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.27433: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.27437: when evaluation is False, skipping this task 35374 1726882917.27440: _execute() done 35374 1726882917.27443: dumping result to json 35374 1726882917.27447: done dumping result, returning 35374 1726882917.27454: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-0000000000b5] 35374 1726882917.27459: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b5 35374 1726882917.27547: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b5 35374 1726882917.27550: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.27604: no more pending results, returning what we have 35374 1726882917.27606: results queue empty 35374 1726882917.27607: checking for any_errors_fatal 35374 1726882917.27611: done checking for any_errors_fatal 35374 1726882917.27612: checking for max_fail_percentage 35374 1726882917.27613: done checking for max_fail_percentage 35374 1726882917.27614: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.27614: done checking to see if all hosts have failed 35374 1726882917.27615: getting the remaining hosts for this loop 35374 1726882917.27616: done getting the remaining hosts for this loop 35374 1726882917.27619: getting the next task for host managed_node1 35374 1726882917.27624: done getting next task for host managed_node1 35374 1726882917.27627: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 35374 1726882917.27629: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 35374 1726882917.27640: getting variables 35374 1726882917.27641: in VariableManager get_vars() 35374 1726882917.27672: Calling all_inventory to load vars for managed_node1 35374 1726882917.27674: Calling groups_inventory to load vars for managed_node1 35374 1726882917.27675: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.27680: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.27682: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.27690: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.27821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.27939: done with get_vars() 35374 1726882917.27945: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 35374 1726882917.27998: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:41:57 -0400 (0:00:00.014) 0:00:03.858 ****** 35374 1726882917.28019: entering _queue_task() for managed_node1/yum 35374 1726882917.28161: worker is 1 (out of 1 available) 35374 1726882917.28177: exiting _queue_task() for managed_node1/yum 35374 1726882917.28187: done queuing things up, now waiting for results queue to drain 35374 1726882917.28188: waiting for pending results... 
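[Editor's note] Note the loader line just above: "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf". On ansible-core 2.17 the yum module name is a redirect to dnf, so a task written against yum still resolves to the dnf action plugin, which is exactly what this log records. The same update check as before, written for yum-based hosts (package list again assumed):

```yaml
- name: >-
    Check if updates for network packages are available through the
    YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:                       # resolves to dnf on core 2.17
    name: "{{ network_packages }}"           # variable name assumed
    state: latest
  check_mode: true
  when: ansible_facts['pkg_mgr'] == 'yum'
```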
35374 1726882917.28334: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 35374 1726882917.28410: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000b6 35374 1726882917.28457: variable 'ansible_search_path' from source: unknown 35374 1726882917.28460: variable 'ansible_search_path' from source: unknown 35374 1726882917.28485: calling self._execute() 35374 1726882917.28558: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.28578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.28591: variable 'omit' from source: magic vars 35374 1726882917.29205: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.29232: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.29365: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.29379: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.29388: when evaluation is False, skipping this task 35374 1726882917.29395: _execute() done 35374 1726882917.29400: dumping result to json 35374 1726882917.29407: done dumping result, returning 35374 1726882917.29417: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-0000000000b6] 35374 1726882917.29426: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b6 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.29582: no more pending results, returning what we have 35374 1726882917.29585: results queue empty 35374 1726882917.29590: checking for any_errors_fatal 35374 1726882917.29595: done checking for any_errors_fatal 35374 1726882917.29596: checking for max_fail_percentage 35374 1726882917.29598: done checking for max_fail_percentage 35374 1726882917.29599: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.29600: done checking to see if all hosts have failed 35374 1726882917.29600: getting the remaining hosts for this loop 35374 1726882917.29602: done getting the remaining hosts for this loop 35374 1726882917.29605: getting the next task for host managed_node1 35374 1726882917.29611: done getting next task for host managed_node1 35374 1726882917.29616: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 35374 1726882917.29618: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.29627: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b6 35374 1726882917.29630: WORKER PROCESS EXITING 35374 1726882917.29642: getting variables 35374 1726882917.29644: in VariableManager get_vars() 35374 1726882917.29684: Calling all_inventory to load vars for managed_node1 35374 1726882917.29686: Calling groups_inventory to load vars for managed_node1 35374 1726882917.29688: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.29699: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.29701: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.29704: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.29878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.30110: done with get_vars() 35374 1726882917.30120: done getting variables 35374 1726882917.30176: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:41:57 -0400 (0:00:00.021) 0:00:03.880 ****** 35374 1726882917.30203: entering _queue_task() for managed_node1/fail 35374 1726882917.30403: worker is 1 (out of 1 available) 35374 1726882917.30416: exiting _queue_task() for managed_node1/fail 35374 1726882917.30428: done queuing things up, now waiting for results queue to drain 35374 1726882917.30429: waiting for pending results... 
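[Editor's note] The "consent" task at main.yml:60 is implemented with the fail action, i.e. consent is modelled as an opt-in variable that must be set for the play to proceed. A sketch, assuming the opt-in flag is named network_allow_restart (the role documents a variable of that name, but the condition below is not copied from it):

```yaml
- name: >-
    Ask user's consent to restart NetworkManager due to wireless or
    team interfaces
  ansible.builtin.fail:
    msg: >-
      Managing wireless or team interfaces may require restarting
      NetworkManager. Set network_allow_restart=true to allow this.
  when:
    - not (network_allow_restart | d(false))  # assumed opt-in flag
```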
35374 1726882917.30684: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 35374 1726882917.30803: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000b7 35374 1726882917.30820: variable 'ansible_search_path' from source: unknown 35374 1726882917.30829: variable 'ansible_search_path' from source: unknown 35374 1726882917.30868: calling self._execute() 35374 1726882917.30944: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.30954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.30970: variable 'omit' from source: magic vars 35374 1726882917.31310: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.31327: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.31447: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.31460: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.31469: when evaluation is False, skipping this task 35374 1726882917.31478: _execute() done 35374 1726882917.31484: dumping result to json 35374 1726882917.31491: done dumping result, returning 35374 1726882917.31501: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-0000000000b7] 35374 1726882917.31511: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b7 35374 1726882917.31622: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b7 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.31679: no more pending results, returning what we have 35374 1726882917.31683: results queue empty 35374 1726882917.31684: checking for any_errors_fatal 35374 1726882917.31689: done checking for any_errors_fatal 35374 1726882917.31690: checking for max_fail_percentage 35374 1726882917.31692: done checking for max_fail_percentage 35374 1726882917.31693: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.31694: done checking to see if all hosts have failed 35374 1726882917.31695: getting the remaining hosts for this loop 35374 1726882917.31696: done getting the remaining hosts for this loop 35374 1726882917.31700: getting the next task for host managed_node1 35374 1726882917.31707: done getting next task for host managed_node1 35374 1726882917.31710: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 35374 1726882917.31713: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.31733: getting variables 35374 1726882917.31735: in VariableManager get_vars() 35374 1726882917.31784: Calling all_inventory to load vars for managed_node1 35374 1726882917.31787: Calling groups_inventory to load vars for managed_node1 35374 1726882917.31790: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.31801: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.31804: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.31807: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.32040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.32255: done with get_vars() 35374 1726882917.32266: done getting variables 35374 1726882917.32420: WORKER PROCESS EXITING 35374 1726882917.32458: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:57 -0400 (0:00:00.022) 0:00:03.903 ****** 35374 1726882917.32496: entering _queue_task() for managed_node1/package 35374 1726882917.32815: worker is 1 (out of 1 available) 35374 1726882917.32833: exiting _queue_task() for managed_node1/package 35374 1726882917.32844: done queuing things up, now waiting for results queue to drain 35374 1726882917.32845: waiting for pending results... 
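[Editor's note] main.yml:73 uses the generic package action rather than dnf or yum directly, which lets one task serve both package managers. A minimal sketch; network_packages is again an assumed name for the role-computed list:

```yaml
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"  # assumed list variable
    state: present
```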
35374 1726882917.33097: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 35374 1726882917.33226: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000b8 35374 1726882917.33244: variable 'ansible_search_path' from source: unknown 35374 1726882917.33252: variable 'ansible_search_path' from source: unknown 35374 1726882917.33297: calling self._execute() 35374 1726882917.33376: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.33388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.33406: variable 'omit' from source: magic vars 35374 1726882917.33767: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.33787: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.33907: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.33922: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.33929: when evaluation is False, skipping this task 35374 1726882917.33940: _execute() done 35374 1726882917.33946: dumping result to json 35374 1726882917.33953: done dumping result, returning 35374 1726882917.33962: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-ee6a-9b8c-0000000000b8] 35374 1726882917.33974: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b8 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.34116: no more pending results, returning what we have 35374 1726882917.34121: results queue empty 35374 1726882917.34122: checking for any_errors_fatal 35374 1726882917.34129: done checking for any_errors_fatal 35374 1726882917.34130: checking for max_fail_percentage 35374 1726882917.34131: done checking for max_fail_percentage 35374 1726882917.34132: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.34133: done checking to see if all hosts have failed 35374 1726882917.34134: getting the remaining hosts for this loop 35374 1726882917.34135: done getting the remaining hosts for this loop 35374 1726882917.34139: getting the next task for host managed_node1 35374 1726882917.34146: done getting next task for host managed_node1 35374 1726882917.34150: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 35374 1726882917.34152: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.34172: getting variables 35374 1726882917.34174: in VariableManager get_vars() 35374 1726882917.34220: Calling all_inventory to load vars for managed_node1 35374 1726882917.34223: Calling groups_inventory to load vars for managed_node1 35374 1726882917.34225: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.34236: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.34239: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.34242: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.34411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.34612: done with get_vars() 35374 1726882917.34622: done getting variables 35374 1726882917.34703: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:57 -0400 (0:00:00.022) 0:00:03.926 ****** 35374 1726882917.34740: entering _queue_task() for managed_node1/package 35374 1726882917.34758: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b8 35374 1726882917.34766: WORKER PROCESS EXITING 35374 1726882917.35179: worker is 1 (out of 1 available) 35374 1726882917.35191: exiting _queue_task() for managed_node1/package 35374 1726882917.35202: done queuing things up, now waiting for results queue to drain 35374 1726882917.35203: waiting for pending results... 
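This second package task (tasks/main.yml:85) is, per its title, additionally gated on the role's network_state variable. The exact guard is not visible in this trace; a plausible sketch, with the network_state test assumed:

# Sketch only -- the network_state guard is an assumption.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}  # assumed: run only when the caller supplied a network_state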
35374 1726882917.35452: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 35374 1726882917.35580: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000b9 35374 1726882917.35600: variable 'ansible_search_path' from source: unknown 35374 1726882917.35609: variable 'ansible_search_path' from source: unknown 35374 1726882917.35649: calling self._execute() 35374 1726882917.35727: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.35738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.35754: variable 'omit' from source: magic vars 35374 1726882917.36099: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.36117: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.36233: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.36244: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.36251: when evaluation is False, skipping this task 35374 1726882917.36258: _execute() done 35374 1726882917.36266: dumping result to json 35374 1726882917.36274: done dumping result, returning 35374 1726882917.36284: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-ee6a-9b8c-0000000000b9] 35374 1726882917.36298: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b9 35374 1726882917.36403: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000b9 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.36445: no more pending results, returning what we have 35374 1726882917.36449: results queue empty 35374 1726882917.36450: checking for any_errors_fatal 35374 1726882917.36454: done checking for any_errors_fatal 35374 1726882917.36455: checking for max_fail_percentage 35374 1726882917.36456: done checking for max_fail_percentage 35374 1726882917.36457: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.36459: done checking to see if all hosts have failed 35374 1726882917.36459: getting the remaining hosts for this loop 35374 1726882917.36461: done getting the remaining hosts for this loop 35374 1726882917.36466: getting the next task for host managed_node1 35374 1726882917.36473: done getting next task for host managed_node1 35374 1726882917.36478: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 35374 1726882917.36480: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.36500: getting variables 35374 1726882917.36502: in VariableManager get_vars() 35374 1726882917.36549: Calling all_inventory to load vars for managed_node1 35374 1726882917.36553: Calling groups_inventory to load vars for managed_node1 35374 1726882917.36555: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.36569: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.36572: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.36576: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.37025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.37401: done with get_vars() 35374 1726882917.37411: done getting variables 35374 1726882917.37472: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 35374 1726882917.37495: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:41:57 -0400 (0:00:00.027) 0:00:03.953 ****** 35374 1726882917.37507: entering _queue_task() for managed_node1/package 35374 1726882917.37742: worker is 1 (out of 1 available) 35374 1726882917.37758: exiting _queue_task() for managed_node1/package 35374 1726882917.37777: done queuing things up, now waiting for results queue to drain 35374 1726882917.37779: waiting for pending results... 
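The same pattern applies to the Python bindings task (tasks/main.yml:96). The package name is taken from the task title; the guard is again an assumption:

# Sketch only -- guard assumed as above.
- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}  # assumed guard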
35374 1726882917.38052: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 35374 1726882917.38199: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000ba 35374 1726882917.38222: variable 'ansible_search_path' from source: unknown 35374 1726882917.38231: variable 'ansible_search_path' from source: unknown 35374 1726882917.38272: calling self._execute() 35374 1726882917.38365: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.38380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.38394: variable 'omit' from source: magic vars 35374 1726882917.38768: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.38783: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.38860: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.38866: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.38869: when evaluation is False, skipping this task 35374 1726882917.38875: _execute() done 35374 1726882917.38881: dumping result to json 35374 1726882917.38889: done dumping result, returning 35374 1726882917.38898: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-ee6a-9b8c-0000000000ba] 35374 1726882917.38901: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000ba 35374 1726882917.39003: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000ba 35374 1726882917.39006: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.39048: no more pending results, returning what we have 35374 1726882917.39051: results queue empty 35374 1726882917.39052: checking for any_errors_fatal 35374 1726882917.39058: done checking for any_errors_fatal 35374 1726882917.39059: checking for max_fail_percentage 35374 1726882917.39060: done checking for max_fail_percentage 35374 1726882917.39061: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.39062: done checking to see if all hosts have failed 35374 1726882917.39063: getting the remaining hosts for this loop 35374 1726882917.39067: done getting the remaining hosts for this loop 35374 1726882917.39070: getting the next task for host managed_node1 35374 1726882917.39076: done getting next task for host managed_node1 35374 1726882917.39080: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 35374 1726882917.39082: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.39096: getting variables 35374 1726882917.39098: in VariableManager get_vars() 35374 1726882917.39131: Calling all_inventory to load vars for managed_node1 35374 1726882917.39133: Calling groups_inventory to load vars for managed_node1 35374 1726882917.39134: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.39140: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.39142: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.39144: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.39259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.39390: done with get_vars() 35374 1726882917.39397: done getting variables 35374 1726882917.39442: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:41:57 -0400 (0:00:00.019) 0:00:03.973 ****** 35374 1726882917.39466: entering _queue_task() for managed_node1/service 35374 1726882917.39624: worker is 1 (out of 1 available) 35374 1726882917.39637: exiting _queue_task() for managed_node1/service 35374 1726882917.39646: done queuing things up, now waiting for results queue to drain 35374 1726882917.39647: waiting for pending results... 
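The action plugin loaded here switches from 'package' to 'service', matching the restart task at tasks/main.yml:109. A sketch of such a task; the wireless/team detection flag is invented for illustration, since the trace shows only the module and task name:

# Sketch only -- the flag name is hypothetical.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_or_team | d(false)  # hypothetical flag set by earlier detection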
35374 1726882917.39802: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 35374 1726882917.39885: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000bb 35374 1726882917.39893: variable 'ansible_search_path' from source: unknown 35374 1726882917.39897: variable 'ansible_search_path' from source: unknown 35374 1726882917.39923: calling self._execute() 35374 1726882917.39983: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.39988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.40001: variable 'omit' from source: magic vars 35374 1726882917.40268: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.40281: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.40356: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.40360: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.40364: when evaluation is False, skipping this task 35374 1726882917.40368: _execute() done 35374 1726882917.40569: dumping result to json 35374 1726882917.40573: done dumping result, returning 35374 1726882917.40576: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-0000000000bb] 35374 1726882917.40578: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000bb 35374 1726882917.40637: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000bb 35374 1726882917.40640: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.40682: no more pending results, returning what we have 35374 1726882917.40684: results queue empty 35374 1726882917.40685: checking for any_errors_fatal 35374 1726882917.40688: done checking for any_errors_fatal 35374 1726882917.40688: checking for max_fail_percentage 35374 1726882917.40690: done checking for max_fail_percentage 35374 1726882917.40690: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.40691: done checking to see if all hosts have failed 35374 1726882917.40691: getting the remaining hosts for this loop 35374 1726882917.40692: done getting the remaining hosts for this loop 35374 1726882917.40694: getting the next task for host managed_node1 35374 1726882917.40698: done getting next task for host managed_node1 35374 1726882917.40700: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 35374 1726882917.40702: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.40712: getting variables 35374 1726882917.40713: in VariableManager get_vars() 35374 1726882917.40742: Calling all_inventory to load vars for managed_node1 35374 1726882917.40743: Calling groups_inventory to load vars for managed_node1 35374 1726882917.40745: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.40755: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.40758: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.40760: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.40910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.41028: done with get_vars() 35374 1726882917.41035: done getting variables 35374 1726882917.41079: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:41:57 -0400 (0:00:00.016) 0:00:03.989 ****** 35374 1726882917.41100: entering _queue_task() for managed_node1/service 35374 1726882917.41260: worker is 1 (out of 1 available) 35374 1726882917.41278: exiting _queue_task() for managed_node1/service 35374 1726882917.41290: done queuing things up, now waiting for results queue to drain 35374 1726882917.41291: waiting for pending results... 35374 1726882917.41467: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 35374 1726882917.41582: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000bc 35374 1726882917.41600: variable 'ansible_search_path' from source: unknown 35374 1726882917.41607: variable 'ansible_search_path' from source: unknown 35374 1726882917.41639: calling self._execute() 35374 1726882917.41718: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.41727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.41737: variable 'omit' from source: magic vars 35374 1726882917.42066: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.42085: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.42190: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.42194: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.42197: when evaluation is False, skipping this task 35374 1726882917.42199: _execute() done 35374 1726882917.42204: dumping result to json 35374 1726882917.42206: done dumping result, returning 35374 1726882917.42211: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-ee6a-9b8c-0000000000bc] 35374 1726882917.42221: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000bc 35374 1726882917.42304: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000bc 35374 1726882917.42307: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this 
result", "changed": false } 35374 1726882917.42374: no more pending results, returning what we have 35374 1726882917.42378: results queue empty 35374 1726882917.42378: checking for any_errors_fatal 35374 1726882917.42383: done checking for any_errors_fatal 35374 1726882917.42383: checking for max_fail_percentage 35374 1726882917.42385: done checking for max_fail_percentage 35374 1726882917.42385: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.42386: done checking to see if all hosts have failed 35374 1726882917.42387: getting the remaining hosts for this loop 35374 1726882917.42388: done getting the remaining hosts for this loop 35374 1726882917.42391: getting the next task for host managed_node1 35374 1726882917.42395: done getting next task for host managed_node1 35374 1726882917.42397: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 35374 1726882917.42399: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882917.42409: getting variables 35374 1726882917.42410: in VariableManager get_vars() 35374 1726882917.42441: Calling all_inventory to load vars for managed_node1 35374 1726882917.42443: Calling groups_inventory to load vars for managed_node1 35374 1726882917.42445: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.42450: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.42452: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.42454: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.42557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.42688: done with get_vars() 35374 1726882917.42694: done getting variables 35374 1726882917.42732: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:41:57 -0400 (0:00:00.016) 0:00:04.006 ****** 35374 1726882917.42752: entering _queue_task() for managed_node1/service 35374 1726882917.42908: worker is 1 (out of 1 available) 35374 1726882917.42921: exiting _queue_task() for managed_node1/service 35374 1726882917.42932: done queuing things up, now waiting for results queue to drain 35374 1726882917.42933: waiting for pending results... 
35374 1726882917.43074: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 35374 1726882917.43139: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000bd 35374 1726882917.43149: variable 'ansible_search_path' from source: unknown 35374 1726882917.43154: variable 'ansible_search_path' from source: unknown 35374 1726882917.43181: calling self._execute() 35374 1726882917.43238: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.43242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.43249: variable 'omit' from source: magic vars 35374 1726882917.43488: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.43498: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.43572: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.43583: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.43586: when evaluation is False, skipping this task 35374 1726882917.43589: _execute() done 35374 1726882917.43592: dumping result to json 35374 1726882917.43594: done dumping result, returning 35374 1726882917.43600: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-ee6a-9b8c-0000000000bd] 35374 1726882917.43606: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000bd 35374 1726882917.43699: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000bd 35374 1726882917.43702: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.43749: no more pending results, returning what we have 35374 1726882917.43751: results queue empty 35374 1726882917.43752: checking for any_errors_fatal 35374 1726882917.43756: done checking for any_errors_fatal 35374 1726882917.43757: checking for max_fail_percentage 35374 1726882917.43758: done checking for max_fail_percentage 35374 1726882917.43759: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.43760: done checking to see if all hosts have failed 35374 1726882917.43760: getting the remaining hosts for this loop 35374 1726882917.43762: done getting the remaining hosts for this loop 35374 1726882917.43767: getting the next task for host managed_node1 35374 1726882917.43772: done getting next task for host managed_node1 35374 1726882917.43775: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 35374 1726882917.43778: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.43788: getting variables 35374 1726882917.43789: in VariableManager get_vars() 35374 1726882917.43816: Calling all_inventory to load vars for managed_node1 35374 1726882917.43818: Calling groups_inventory to load vars for managed_node1 35374 1726882917.43819: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.43825: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.43826: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.43828: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.43966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.44085: done with get_vars() 35374 1726882917.44092: done getting variables 35374 1726882917.44129: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.020 ****** 35374 1726882917.44147: entering _queue_task() for managed_node1/service 35374 1726882917.44302: worker is 1 (out of 1 available) 35374 1726882917.44315: exiting _queue_task() for managed_node1/service 35374 1726882917.44325: done queuing things up, now waiting for results queue to drain 35374 1726882917.44327: waiting for pending results... 35374 1726882917.44470: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 35374 1726882917.44535: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000be 35374 1726882917.44546: variable 'ansible_search_path' from source: unknown 35374 1726882917.44550: variable 'ansible_search_path' from source: unknown 35374 1726882917.44577: calling self._execute() 35374 1726882917.44634: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.44638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.44645: variable 'omit' from source: magic vars 35374 1726882917.44880: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.44889: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.44961: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.44966: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.44971: when evaluation is False, skipping this task 35374 1726882917.44976: _execute() done 35374 1726882917.44979: dumping result to json 35374 1726882917.44981: done dumping result, returning 35374 1726882917.44984: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-ee6a-9b8c-0000000000be] 35374 1726882917.44989: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000be 35374 1726882917.45070: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000be 35374 1726882917.45073: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 
"changed": false } 35374 1726882917.45131: no more pending results, returning what we have 35374 1726882917.45134: results queue empty 35374 1726882917.45135: checking for any_errors_fatal 35374 1726882917.45139: done checking for any_errors_fatal 35374 1726882917.45140: checking for max_fail_percentage 35374 1726882917.45141: done checking for max_fail_percentage 35374 1726882917.45142: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.45143: done checking to see if all hosts have failed 35374 1726882917.45143: getting the remaining hosts for this loop 35374 1726882917.45144: done getting the remaining hosts for this loop 35374 1726882917.45147: getting the next task for host managed_node1 35374 1726882917.45152: done getting next task for host managed_node1 35374 1726882917.45155: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 35374 1726882917.45157: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882917.45171: getting variables 35374 1726882917.45172: in VariableManager get_vars() 35374 1726882917.45199: Calling all_inventory to load vars for managed_node1 35374 1726882917.45202: Calling groups_inventory to load vars for managed_node1 35374 1726882917.45205: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.45210: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.45212: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.45214: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.45313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.45435: done with get_vars() 35374 1726882917.45442: done getting variables 35374 1726882917.45483: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.033 ****** 35374 1726882917.45503: entering _queue_task() for managed_node1/copy 35374 1726882917.45644: worker is 1 (out of 1 available) 35374 1726882917.45655: exiting _queue_task() for managed_node1/copy 35374 1726882917.45667: done queuing things up, now waiting for results queue to drain 35374 1726882917.45669: waiting for pending results... 
35374 1726882917.45809: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 35374 1726882917.45880: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000bf 35374 1726882917.45890: variable 'ansible_search_path' from source: unknown 35374 1726882917.45893: variable 'ansible_search_path' from source: unknown 35374 1726882917.45916: calling self._execute() 35374 1726882917.45971: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.45974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.45984: variable 'omit' from source: magic vars 35374 1726882917.46230: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.46240: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.46317: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.46320: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.46323: when evaluation is False, skipping this task 35374 1726882917.46328: _execute() done 35374 1726882917.46331: dumping result to json 35374 1726882917.46333: done dumping result, returning 35374 1726882917.46350: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-ee6a-9b8c-0000000000bf] 35374 1726882917.46359: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000bf 35374 1726882917.46438: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000bf 35374 1726882917.46443: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.46491: no more pending results, returning what we have 35374 1726882917.46493: results queue empty 35374 1726882917.46494: checking for any_errors_fatal 35374 1726882917.46499: done checking for any_errors_fatal 35374 1726882917.46500: checking for max_fail_percentage 35374 1726882917.46501: done checking for max_fail_percentage 35374 1726882917.46502: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.46503: done checking to see if all hosts have failed 35374 1726882917.46503: getting the remaining hosts for this loop 35374 1726882917.46504: done getting the remaining hosts for this loop 35374 1726882917.46507: getting the next task for host managed_node1 35374 1726882917.46511: done getting next task for host managed_node1 35374 1726882917.46514: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 35374 1726882917.46517: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.46528: getting variables 35374 1726882917.46529: in VariableManager get_vars() 35374 1726882917.46570: Calling all_inventory to load vars for managed_node1 35374 1726882917.46572: Calling groups_inventory to load vars for managed_node1 35374 1726882917.46573: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.46581: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.46582: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.46584: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.46717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.46838: done with get_vars() 35374 1726882917.46848: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.047 ****** 35374 1726882917.46904: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 35374 1726882917.47048: worker is 1 (out of 1 available) 35374 1726882917.47060: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 35374 1726882917.47074: done queuing things up, now waiting for results queue to drain 35374 1726882917.47075: waiting for pending results... 35374 1726882917.47206: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 35374 1726882917.47275: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000c0 35374 1726882917.47285: variable 'ansible_search_path' from source: unknown 35374 1726882917.47289: variable 'ansible_search_path' from source: unknown 35374 1726882917.47313: calling self._execute() 35374 1726882917.47370: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.47374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.47379: variable 'omit' from source: magic vars 35374 1726882917.47609: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.47620: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.47698: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.47701: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.47704: when evaluation is False, skipping this task 35374 1726882917.47707: _execute() done 35374 1726882917.47709: dumping result to json 35374 1726882917.47712: done dumping result, returning 35374 1726882917.47719: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-ee6a-9b8c-0000000000c0] 35374 1726882917.47724: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000c0 35374 1726882917.47812: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000c0 35374 1726882917.47815: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.47876: no more pending results, returning what we have 35374 1726882917.47878: results queue empty 35374 1726882917.47879: checking for any_errors_fatal 35374 1726882917.47882: 
done checking for any_errors_fatal 35374 1726882917.47883: checking for max_fail_percentage 35374 1726882917.47884: done checking for max_fail_percentage 35374 1726882917.47885: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.47886: done checking to see if all hosts have failed 35374 1726882917.47886: getting the remaining hosts for this loop 35374 1726882917.47887: done getting the remaining hosts for this loop 35374 1726882917.47890: getting the next task for host managed_node1 35374 1726882917.47895: done getting next task for host managed_node1 35374 1726882917.47898: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 35374 1726882917.47900: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882917.47912: getting variables 35374 1726882917.47914: in VariableManager get_vars() 35374 1726882917.47956: Calling all_inventory to load vars for managed_node1 35374 1726882917.47958: Calling groups_inventory to load vars for managed_node1 35374 1726882917.47960: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.47971: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.47974: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.47977: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.48135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.48337: done with get_vars() 35374 1726882917.48345: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:41:57 -0400 (0:00:00.015) 0:00:04.062 ****** 35374 1726882917.48419: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 35374 1726882917.48611: worker is 1 (out of 1 available) 35374 1726882917.48621: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 35374 1726882917.48631: done queuing things up, now waiting for results queue to drain 35374 1726882917.48632: waiting for pending results... 
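The two tasks queued around here ('Configure networking connection profiles' at main.yml:159 and 'Configure networking state' at main.yml:171) are the role's core work: they dispatch to the collection's own action plugins, fedora.linux_system_roles.network_connections and fedora.linux_system_roles.network_state, rather than to built-in modules. A sketch of a play that would exercise the network_state path; the interface details are hypothetical:

# Sketch only -- interface details are hypothetical.
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:  # consumed by the network_state action plugin
          interfaces:
            - name: eth0  # hypothetical interface
              type: ethernet
              state: up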
35374 1726882917.48875: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 35374 1726882917.48989: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000c1 35374 1726882917.49005: variable 'ansible_search_path' from source: unknown 35374 1726882917.49018: variable 'ansible_search_path' from source: unknown 35374 1726882917.49051: calling self._execute() 35374 1726882917.49135: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.49144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.49154: variable 'omit' from source: magic vars 35374 1726882917.49495: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.49514: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.49626: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.49635: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.49641: when evaluation is False, skipping this task 35374 1726882917.49646: _execute() done 35374 1726882917.49652: dumping result to json 35374 1726882917.49659: done dumping result, returning 35374 1726882917.49678: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-ee6a-9b8c-0000000000c1] 35374 1726882917.49687: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000c1 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.49816: no more pending results, returning what we have 35374 1726882917.49822: results queue empty 35374 1726882917.49823: checking for any_errors_fatal 35374 1726882917.49829: done checking for any_errors_fatal 35374 1726882917.49830: checking for max_fail_percentage 35374 1726882917.49832: done checking for max_fail_percentage 35374 1726882917.49833: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.49833: done checking to see if all hosts have failed 35374 1726882917.49834: getting the remaining hosts for this loop 35374 1726882917.49836: done getting the remaining hosts for this loop 35374 1726882917.49839: getting the next task for host managed_node1 35374 1726882917.49846: done getting next task for host managed_node1 35374 1726882917.49850: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 35374 1726882917.49853: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.49883: getting variables 35374 1726882917.49885: in VariableManager get_vars() 35374 1726882917.49926: Calling all_inventory to load vars for managed_node1 35374 1726882917.49929: Calling groups_inventory to load vars for managed_node1 35374 1726882917.49932: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.49939: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.49942: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.49944: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.50054: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000c1 35374 1726882917.50096: WORKER PROCESS EXITING 35374 1726882917.50107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.50225: done with get_vars() 35374 1726882917.50231: done getting variables 35374 1726882917.50270: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:41:57 -0400 (0:00:00.018) 0:00:04.081 ****** 35374 1726882917.50291: entering _queue_task() for managed_node1/debug 35374 1726882917.50441: worker is 1 (out of 1 available) 35374 1726882917.50455: exiting _queue_task() for managed_node1/debug 35374 1726882917.50470: done queuing things up, now waiting for results queue to drain 35374 1726882917.50472: waiting for pending results... 
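The remaining tasks in this block are debug reporters (main.yml:177, :181, :186). The registered result they would print from is not visible in this trace; the variable name below is an assumption:

# Sketch only -- the registered variable name is assumed.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr  # assumed register name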
35374 1726882917.50607: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 35374 1726882917.50681: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000c2 35374 1726882917.50692: variable 'ansible_search_path' from source: unknown 35374 1726882917.50696: variable 'ansible_search_path' from source: unknown 35374 1726882917.50720: calling self._execute() 35374 1726882917.50777: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.50781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.50789: variable 'omit' from source: magic vars 35374 1726882917.51016: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.51026: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.51107: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.51110: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.51113: when evaluation is False, skipping this task 35374 1726882917.51116: _execute() done 35374 1726882917.51119: dumping result to json 35374 1726882917.51121: done dumping result, returning 35374 1726882917.51130: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-ee6a-9b8c-0000000000c2] 35374 1726882917.51135: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000c2 35374 1726882917.51218: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000c2 35374 1726882917.51221: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882917.51280: no more pending results, returning what we have 35374 1726882917.51282: results queue empty 35374 1726882917.51283: checking for any_errors_fatal 35374 1726882917.51288: done checking for any_errors_fatal 35374 1726882917.51288: checking for max_fail_percentage 35374 1726882917.51290: done checking for max_fail_percentage 35374 1726882917.51291: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.51292: done checking to see if all hosts have failed 35374 1726882917.51292: getting the remaining hosts for this loop 35374 1726882917.51293: done getting the remaining hosts for this loop 35374 1726882917.51296: getting the next task for host managed_node1 35374 1726882917.51299: done getting next task for host managed_node1 35374 1726882917.51302: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 35374 1726882917.51303: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.51313: getting variables 35374 1726882917.51314: in VariableManager get_vars() 35374 1726882917.51343: Calling all_inventory to load vars for managed_node1 35374 1726882917.51344: Calling groups_inventory to load vars for managed_node1 35374 1726882917.51346: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.51351: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.51353: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.51354: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.51453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.51578: done with get_vars() 35374 1726882917.51584: done getting variables 35374 1726882917.51621: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.095 ****** 35374 1726882917.51641: entering _queue_task() for managed_node1/debug 35374 1726882917.51797: worker is 1 (out of 1 available) 35374 1726882917.51810: exiting _queue_task() for managed_node1/debug 35374 1726882917.51821: done queuing things up, now waiting for results queue to drain 35374 1726882917.51822: waiting for pending results... 35374 1726882917.51956: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 35374 1726882917.52028: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000c3 35374 1726882917.52037: variable 'ansible_search_path' from source: unknown 35374 1726882917.52042: variable 'ansible_search_path' from source: unknown 35374 1726882917.52071: calling self._execute() 35374 1726882917.52121: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.52125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.52133: variable 'omit' from source: magic vars 35374 1726882917.52359: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.52370: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.52450: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.52453: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.52456: when evaluation is False, skipping this task 35374 1726882917.52459: _execute() done 35374 1726882917.52461: dumping result to json 35374 1726882917.52468: done dumping result, returning 35374 1726882917.52477: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-ee6a-9b8c-0000000000c3] 35374 1726882917.52486: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000c3 35374 1726882917.52563: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000c3 35374 1726882917.52567: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == 
'7'" } 35374 1726882917.52629: no more pending results, returning what we have 35374 1726882917.52632: results queue empty 35374 1726882917.52633: checking for any_errors_fatal 35374 1726882917.52637: done checking for any_errors_fatal 35374 1726882917.52638: checking for max_fail_percentage 35374 1726882917.52640: done checking for max_fail_percentage 35374 1726882917.52640: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.52641: done checking to see if all hosts have failed 35374 1726882917.52642: getting the remaining hosts for this loop 35374 1726882917.52643: done getting the remaining hosts for this loop 35374 1726882917.52646: getting the next task for host managed_node1 35374 1726882917.52650: done getting next task for host managed_node1 35374 1726882917.52652: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 35374 1726882917.52654: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882917.52666: getting variables 35374 1726882917.52667: in VariableManager get_vars() 35374 1726882917.52702: Calling all_inventory to load vars for managed_node1 35374 1726882917.52704: Calling groups_inventory to load vars for managed_node1 35374 1726882917.52705: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.52710: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.52712: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.52714: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.52844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.52960: done with get_vars() 35374 1726882917.52968: done getting variables 35374 1726882917.53003: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.108 ****** 35374 1726882917.53024: entering _queue_task() for managed_node1/debug 35374 1726882917.53171: worker is 1 (out of 1 available) 35374 1726882917.53184: exiting _queue_task() for managed_node1/debug 35374 1726882917.53195: done queuing things up, now waiting for results queue to drain 35374 1726882917.53196: waiting for pending results... 
35374 1726882917.53333: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 35374 1726882917.53401: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000c4 35374 1726882917.53410: variable 'ansible_search_path' from source: unknown 35374 1726882917.53413: variable 'ansible_search_path' from source: unknown 35374 1726882917.53439: calling self._execute() 35374 1726882917.53496: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.53500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.53508: variable 'omit' from source: magic vars 35374 1726882917.53737: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.53747: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.53826: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.53830: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.53833: when evaluation is False, skipping this task 35374 1726882917.53836: _execute() done 35374 1726882917.53838: dumping result to json 35374 1726882917.53842: done dumping result, returning 35374 1726882917.53850: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-ee6a-9b8c-0000000000c4] 35374 1726882917.53853: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000c4 35374 1726882917.53935: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000c4 35374 1726882917.53937: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882917.53997: no more pending results, returning what we have 35374 1726882917.54000: results queue empty 35374 1726882917.54001: checking for any_errors_fatal 35374 1726882917.54005: done checking for any_errors_fatal 35374 1726882917.54006: checking for max_fail_percentage 35374 1726882917.54007: done checking for max_fail_percentage 35374 1726882917.54008: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.54008: done checking to see if all hosts have failed 35374 1726882917.54009: getting the remaining hosts for this loop 35374 1726882917.54010: done getting the remaining hosts for this loop 35374 1726882917.54012: getting the next task for host managed_node1 35374 1726882917.54015: done getting next task for host managed_node1 35374 1726882917.54018: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 35374 1726882917.54019: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 35374 1726882917.54029: getting variables 35374 1726882917.54030: in VariableManager get_vars() 35374 1726882917.54072: Calling all_inventory to load vars for managed_node1 35374 1726882917.54074: Calling groups_inventory to load vars for managed_node1 35374 1726882917.54076: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.54081: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.54083: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.54084: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.54185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.54303: done with get_vars() 35374 1726882917.54310: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.122 ****** 35374 1726882917.54368: entering _queue_task() for managed_node1/ping 35374 1726882917.54511: worker is 1 (out of 1 available) 35374 1726882917.54525: exiting _queue_task() for managed_node1/ping 35374 1726882917.54536: done queuing things up, now waiting for results queue to drain 35374 1726882917.54537: waiting for pending results... 35374 1726882917.54678: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 35374 1726882917.54743: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000c5 35374 1726882917.54753: variable 'ansible_search_path' from source: unknown 35374 1726882917.54758: variable 'ansible_search_path' from source: unknown 35374 1726882917.54786: calling self._execute() 35374 1726882917.54838: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.54841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.54849: variable 'omit' from source: magic vars 35374 1726882917.55084: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.55093: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.55168: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.55174: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.55177: when evaluation is False, skipping this task 35374 1726882917.55180: _execute() done 35374 1726882917.55183: dumping result to json 35374 1726882917.55186: done dumping result, returning 35374 1726882917.55192: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-ee6a-9b8c-0000000000c5] 35374 1726882917.55199: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000c5 35374 1726882917.55274: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000c5 35374 1726882917.55277: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.55336: no more pending results, returning what we have 35374 1726882917.55338: results queue empty 35374 1726882917.55339: checking for any_errors_fatal 35374 1726882917.55343: done checking for any_errors_fatal 35374 1726882917.55344: checking for max_fail_percentage 35374 1726882917.55345: done checking 
for max_fail_percentage 35374 1726882917.55346: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.55346: done checking to see if all hosts have failed 35374 1726882917.55347: getting the remaining hosts for this loop 35374 1726882917.55348: done getting the remaining hosts for this loop 35374 1726882917.55349: getting the next task for host managed_node1 35374 1726882917.55354: done getting next task for host managed_node1 35374 1726882917.55356: ^ task is: TASK: meta (role_complete) 35374 1726882917.55358: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 35374 1726882917.55371: getting variables 35374 1726882917.55372: in VariableManager get_vars() 35374 1726882917.55404: Calling all_inventory to load vars for managed_node1 35374 1726882917.55406: Calling groups_inventory to load vars for managed_node1 35374 1726882917.55407: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.55413: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.55414: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.55416: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.55516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.55661: done with get_vars() 35374 1726882917.55669: done getting variables 35374 1726882917.55720: done queuing things up, now waiting for results queue to drain 35374 1726882917.55722: results queue empty 35374 1726882917.55722: checking for any_errors_fatal 35374 1726882917.55724: done checking for any_errors_fatal 35374 1726882917.55724: checking for max_fail_percentage 35374 1726882917.55725: done checking for max_fail_percentage 35374 1726882917.55725: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.55725: done checking to see if all hosts have failed 35374 1726882917.55726: getting the remaining hosts for this loop 35374 1726882917.55726: done getting the remaining hosts for this loop 35374 1726882917.55728: getting the next task for host managed_node1 35374 1726882917.55732: done getting next task for host managed_node1 35374 1726882917.55734: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 35374 1726882917.55735: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.55741: getting variables 35374 1726882917.55742: in VariableManager get_vars() 35374 1726882917.55752: Calling all_inventory to load vars for managed_node1 35374 1726882917.55754: Calling groups_inventory to load vars for managed_node1 35374 1726882917.55755: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.55758: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.55759: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.55760: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.55839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.55951: done with get_vars() 35374 1726882917.55957: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:57 -0400 (0:00:00.016) 0:00:04.138 ****** 35374 1726882917.56005: entering _queue_task() for managed_node1/include_tasks 35374 1726882917.56152: worker is 1 (out of 1 available) 35374 1726882917.56165: exiting _queue_task() for managed_node1/include_tasks 35374 1726882917.56176: done queuing things up, now waiting for results queue to drain 35374 1726882917.56177: waiting for pending results... 35374 1726882917.56319: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 35374 1726882917.56397: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000fd 35374 1726882917.56405: variable 'ansible_search_path' from source: unknown 35374 1726882917.56409: variable 'ansible_search_path' from source: unknown 35374 1726882917.56432: calling self._execute() 35374 1726882917.56487: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.56491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.56499: variable 'omit' from source: magic vars 35374 1726882917.56726: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.56735: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.56819: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.56824: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.56827: when evaluation is False, skipping this task 35374 1726882917.56830: _execute() done 35374 1726882917.56832: dumping result to json 35374 1726882917.56834: done dumping result, returning 35374 1726882917.56843: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-ee6a-9b8c-0000000000fd] 35374 1726882917.56848: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000fd 35374 1726882917.56931: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000fd 35374 1726882917.56934: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.56987: no more pending results, returning what we have 35374 
1726882917.56990: results queue empty 35374 1726882917.56990: checking for any_errors_fatal 35374 1726882917.56992: done checking for any_errors_fatal 35374 1726882917.56992: checking for max_fail_percentage 35374 1726882917.56994: done checking for max_fail_percentage 35374 1726882917.56995: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.56996: done checking to see if all hosts have failed 35374 1726882917.56996: getting the remaining hosts for this loop 35374 1726882917.56998: done getting the remaining hosts for this loop 35374 1726882917.57001: getting the next task for host managed_node1 35374 1726882917.57006: done getting next task for host managed_node1 35374 1726882917.57009: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 35374 1726882917.57011: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.57021: getting variables 35374 1726882917.57022: in VariableManager get_vars() 35374 1726882917.57053: Calling all_inventory to load vars for managed_node1 35374 1726882917.57055: Calling groups_inventory to load vars for managed_node1 35374 1726882917.57056: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.57062: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.57065: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.57068: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.57170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.57311: done with get_vars() 35374 1726882917.57317: done getting variables 35374 1726882917.57352: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.152 ****** 35374 1726882917.57374: entering _queue_task() for managed_node1/debug 35374 1726882917.57521: worker is 1 (out of 1 available) 35374 1726882917.57534: exiting _queue_task() for managed_node1/debug 35374 1726882917.57544: done queuing things up, now waiting for results queue to drain 35374 1726882917.57546: waiting for pending results... 
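[NOTE] Every task in this stretch is skipped by the same two-stage guard: Ansible evaluates the `when:` list in order, the first clause (ansible_distribution_major_version != '6') passes, the second (ansible_distribution_major_version == '7') fails, and the task is reported as "skipping" with that clause as false_condition. A minimal sketch of the debug task just queued, assuming the test wraps the role's tasks in this version guard (illustrative, not the role's verbatim source):

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version == '7'  # False on this host, so the task skips
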
35374 1726882917.57686: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 35374 1726882917.57762: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000fe 35374 1726882917.57777: variable 'ansible_search_path' from source: unknown 35374 1726882917.57780: variable 'ansible_search_path' from source: unknown 35374 1726882917.57804: calling self._execute() 35374 1726882917.57853: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.57857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.57867: variable 'omit' from source: magic vars 35374 1726882917.58096: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.58106: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.58184: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.58187: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.58190: when evaluation is False, skipping this task 35374 1726882917.58193: _execute() done 35374 1726882917.58195: dumping result to json 35374 1726882917.58198: done dumping result, returning 35374 1726882917.58206: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-ee6a-9b8c-0000000000fe] 35374 1726882917.58209: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000fe 35374 1726882917.58289: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000fe 35374 1726882917.58292: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882917.58351: no more pending results, returning what we have 35374 1726882917.58354: results queue empty 35374 1726882917.58354: checking for any_errors_fatal 35374 1726882917.58358: done checking for any_errors_fatal 35374 1726882917.58359: checking for max_fail_percentage 35374 1726882917.58360: done checking for max_fail_percentage 35374 1726882917.58360: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.58361: done checking to see if all hosts have failed 35374 1726882917.58361: getting the remaining hosts for this loop 35374 1726882917.58362: done getting the remaining hosts for this loop 35374 1726882917.58366: getting the next task for host managed_node1 35374 1726882917.58370: done getting next task for host managed_node1 35374 1726882917.58373: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 35374 1726882917.58376: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.58386: getting variables 35374 1726882917.58387: in VariableManager get_vars() 35374 1726882917.58418: Calling all_inventory to load vars for managed_node1 35374 1726882917.58419: Calling groups_inventory to load vars for managed_node1 35374 1726882917.58421: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.58426: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.58428: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.58429: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.58529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.58647: done with get_vars() 35374 1726882917.58654: done getting variables 35374 1726882917.58702: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.165 ****** 35374 1726882917.58722: entering _queue_task() for managed_node1/fail 35374 1726882917.58868: worker is 1 (out of 1 available) 35374 1726882917.58880: exiting _queue_task() for managed_node1/fail 35374 1726882917.58891: done queuing things up, now waiting for results queue to drain 35374 1726882917.58893: waiting for pending results... 
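[NOTE] The task queued above is a guard rail: the log shows it loads the fail action, so it aborts the play rather than silently ignoring an unsupported combination. A plausible shape, assuming the role rejects a non-empty network_state together with the initscripts provider (variable names inferred from the task name, not confirmed by the log):

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: The network_state variable is not supported with the initscripts provider.
      when:
        - network_state | length > 0          # assumed: a network state was requested
        - network_provider == "initscripts"   # assumed: provider resolved to initscripts
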
35374 1726882917.59030: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 35374 1726882917.59111: in run() - task 0e448fcc-3ce9-ee6a-9b8c-0000000000ff 35374 1726882917.59119: variable 'ansible_search_path' from source: unknown 35374 1726882917.59123: variable 'ansible_search_path' from source: unknown 35374 1726882917.59148: calling self._execute() 35374 1726882917.59201: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.59204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.59212: variable 'omit' from source: magic vars 35374 1726882917.59436: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.59447: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.59526: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.59530: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.59532: when evaluation is False, skipping this task 35374 1726882917.59536: _execute() done 35374 1726882917.59538: dumping result to json 35374 1726882917.59542: done dumping result, returning 35374 1726882917.59549: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-ee6a-9b8c-0000000000ff] 35374 1726882917.59552: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000ff 35374 1726882917.59639: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-0000000000ff 35374 1726882917.59642: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.59696: no more pending results, returning what we have 35374 1726882917.59699: results queue empty 35374 1726882917.59700: checking for any_errors_fatal 35374 1726882917.59704: done checking for any_errors_fatal 35374 1726882917.59705: checking for max_fail_percentage 35374 1726882917.59706: done checking for max_fail_percentage 35374 1726882917.59706: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.59707: done checking to see if all hosts have failed 35374 1726882917.59707: getting the remaining hosts for this loop 35374 1726882917.59708: done getting the remaining hosts for this loop 35374 1726882917.59710: getting the next task for host managed_node1 35374 1726882917.59714: done getting next task for host managed_node1 35374 1726882917.59716: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 35374 1726882917.59718: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.59728: getting variables 35374 1726882917.59729: in VariableManager get_vars() 35374 1726882917.59760: Calling all_inventory to load vars for managed_node1 35374 1726882917.59762: Calling groups_inventory to load vars for managed_node1 35374 1726882917.59765: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.59771: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.59772: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.59774: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.59903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.60020: done with get_vars() 35374 1726882917.60026: done getting variables 35374 1726882917.60061: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.179 ****** 35374 1726882917.60084: entering _queue_task() for managed_node1/fail 35374 1726882917.60232: worker is 1 (out of 1 available) 35374 1726882917.60246: exiting _queue_task() for managed_node1/fail 35374 1726882917.60257: done queuing things up, now waiting for results queue to drain 35374 1726882917.60259: waiting for pending results... 
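[NOTE] Same guard-rail pattern for the minimum-version check queued above; a hedged sketch assuming the distribution fact is cast to an integer for the comparison:

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying network_state requires a managed host running EL 8 or later.
      when:
        - network_state | length > 0                     # assumed guard, as above
        - ansible_distribution_major_version | int < 8   # fact is a string; cast before comparing
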
35374 1726882917.60396: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 35374 1726882917.60470: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000100 35374 1726882917.60482: variable 'ansible_search_path' from source: unknown 35374 1726882917.60485: variable 'ansible_search_path' from source: unknown 35374 1726882917.60510: calling self._execute() 35374 1726882917.60559: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.60562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.60575: variable 'omit' from source: magic vars 35374 1726882917.60802: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.60811: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.60889: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.60893: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.60895: when evaluation is False, skipping this task 35374 1726882917.60899: _execute() done 35374 1726882917.60901: dumping result to json 35374 1726882917.60904: done dumping result, returning 35374 1726882917.60913: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-ee6a-9b8c-000000000100] 35374 1726882917.60916: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000100 35374 1726882917.61005: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000100 35374 1726882917.61008: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.61057: no more pending results, returning what we have 35374 1726882917.61060: results queue empty 35374 1726882917.61061: checking for any_errors_fatal 35374 1726882917.61068: done checking for any_errors_fatal 35374 1726882917.61068: checking for max_fail_percentage 35374 1726882917.61070: done checking for max_fail_percentage 35374 1726882917.61070: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.61071: done checking to see if all hosts have failed 35374 1726882917.61071: getting the remaining hosts for this loop 35374 1726882917.61072: done getting the remaining hosts for this loop 35374 1726882917.61074: getting the next task for host managed_node1 35374 1726882917.61083: done getting next task for host managed_node1 35374 1726882917.61086: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 35374 1726882917.61088: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.61097: getting variables 35374 1726882917.61098: in VariableManager get_vars() 35374 1726882917.61125: Calling all_inventory to load vars for managed_node1 35374 1726882917.61126: Calling groups_inventory to load vars for managed_node1 35374 1726882917.61127: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.61132: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.61134: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.61136: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.61234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.61351: done with get_vars() 35374 1726882917.61358: done getting variables 35374 1726882917.61394: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.192 ****** 35374 1726882917.61417: entering _queue_task() for managed_node1/fail 35374 1726882917.61566: worker is 1 (out of 1 available) 35374 1726882917.61579: exiting _queue_task() for managed_node1/fail 35374 1726882917.61591: done queuing things up, now waiting for results queue to drain 35374 1726882917.61592: waiting for pending results... 
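[NOTE] The EL10 teaming guard queued above presumably inspects the requested connection profiles for a team type before failing; a sketch under that assumption (the selectattr chain is illustrative, not the role's verbatim filter):

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL 10 or later.
      when:
        - ansible_distribution_major_version | int >= 10
        - network_connections | selectattr('type', 'defined') | selectattr('type', 'equalto', 'team') | list | length > 0
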
35374 1726882917.61730: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 35374 1726882917.61808: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000101 35374 1726882917.61817: variable 'ansible_search_path' from source: unknown 35374 1726882917.61820: variable 'ansible_search_path' from source: unknown 35374 1726882917.61849: calling self._execute() 35374 1726882917.61900: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.61903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.61911: variable 'omit' from source: magic vars 35374 1726882917.62138: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.62148: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.62227: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.62230: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.62233: when evaluation is False, skipping this task 35374 1726882917.62236: _execute() done 35374 1726882917.62238: dumping result to json 35374 1726882917.62242: done dumping result, returning 35374 1726882917.62250: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-ee6a-9b8c-000000000101] 35374 1726882917.62253: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000101 35374 1726882917.62337: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000101 35374 1726882917.62340: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.62396: no more pending results, returning what we have 35374 1726882917.62398: results queue empty 35374 1726882917.62399: checking for any_errors_fatal 35374 1726882917.62405: done checking for any_errors_fatal 35374 1726882917.62405: checking for max_fail_percentage 35374 1726882917.62406: done checking for max_fail_percentage 35374 1726882917.62407: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.62407: done checking to see if all hosts have failed 35374 1726882917.62408: getting the remaining hosts for this loop 35374 1726882917.62408: done getting the remaining hosts for this loop 35374 1726882917.62410: getting the next task for host managed_node1 35374 1726882917.62414: done getting next task for host managed_node1 35374 1726882917.62417: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 35374 1726882917.62419: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.62429: getting variables 35374 1726882917.62430: in VariableManager get_vars() 35374 1726882917.62461: Calling all_inventory to load vars for managed_node1 35374 1726882917.62462: Calling groups_inventory to load vars for managed_node1 35374 1726882917.62466: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.62472: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.62474: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.62475: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.62604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.62725: done with get_vars() 35374 1726882917.62732: done getting variables 35374 1726882917.62769: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.206 ****** 35374 1726882917.62790: entering _queue_task() for managed_node1/dnf 35374 1726882917.62952: worker is 1 (out of 1 available) 35374 1726882917.62967: exiting _queue_task() for managed_node1/dnf 35374 1726882917.62979: done queuing things up, now waiting for results queue to drain 35374 1726882917.62980: waiting for pending results... 
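[NOTE] The dnf task queued above would, when its conditions hold, query the package manager rather than change anything; a hedged reconstruction in which check_mode keeps the task read-only (the package-list variable and register name are assumptions):

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # assumed variable holding the role's package set
        state: latest
      check_mode: true                   # report available updates without installing them
      register: __network_dnf_check     # hypothetical register name
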
35374 1726882917.63117: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 35374 1726882917.63211: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000102 35374 1726882917.63226: variable 'ansible_search_path' from source: unknown 35374 1726882917.63229: variable 'ansible_search_path' from source: unknown 35374 1726882917.63252: calling self._execute() 35374 1726882917.63306: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.63309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.63321: variable 'omit' from source: magic vars 35374 1726882917.63553: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.63564: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.63639: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.63643: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.63645: when evaluation is False, skipping this task 35374 1726882917.63650: _execute() done 35374 1726882917.63652: dumping result to json 35374 1726882917.63655: done dumping result, returning 35374 1726882917.63659: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-000000000102] 35374 1726882917.63675: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000102 35374 1726882917.63776: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000102 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.63820: no more pending results, returning what we have 35374 1726882917.63823: results queue empty 35374 1726882917.63824: checking for any_errors_fatal 35374 1726882917.63830: done checking for any_errors_fatal 35374 1726882917.63831: checking for max_fail_percentage 35374 1726882917.63832: done checking for max_fail_percentage 35374 1726882917.63833: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.63834: done checking to see if all hosts have failed 35374 1726882917.63835: getting the remaining hosts for this loop 35374 1726882917.63838: done getting the remaining hosts for this loop 35374 1726882917.63841: getting the next task for host managed_node1 35374 1726882917.63848: done getting next task for host managed_node1 35374 1726882917.63852: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 35374 1726882917.63856: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.63876: getting variables 35374 1726882917.63879: in VariableManager get_vars() 35374 1726882917.63920: Calling all_inventory to load vars for managed_node1 35374 1726882917.63923: Calling groups_inventory to load vars for managed_node1 35374 1726882917.63926: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.63936: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.63939: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.63942: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.64120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.64330: done with get_vars() 35374 1726882917.64341: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 35374 1726882917.64435: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:41:57 -0400 (0:00:00.018) 0:00:04.225 ****** 35374 1726882917.64636: entering _queue_task() for managed_node1/yum 35374 1726882917.64648: WORKER PROCESS EXITING 35374 1726882917.64837: worker is 1 (out of 1 available) 35374 1726882917.64848: exiting _queue_task() for managed_node1/yum 35374 1726882917.64860: done queuing things up, now waiting for results queue to drain 35374 1726882917.64861: waiting for pending results... 
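[NOTE] Note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above: on ansible-core 2.17 the yum action is routed to the dnf action plugin, so this YUM-flavoured check runs the same code path as the previous task. A sketch, with the same assumptions:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:   # resolved to ansible.builtin.dnf by the builtin routing table
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
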
35374 1726882917.65096: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 35374 1726882917.65176: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000103 35374 1726882917.65196: variable 'ansible_search_path' from source: unknown 35374 1726882917.65205: variable 'ansible_search_path' from source: unknown 35374 1726882917.65235: calling self._execute() 35374 1726882917.65292: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.65299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.65310: variable 'omit' from source: magic vars 35374 1726882917.65559: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.65584: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.65659: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.65665: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.65673: when evaluation is False, skipping this task 35374 1726882917.65676: _execute() done 35374 1726882917.65678: dumping result to json 35374 1726882917.65682: done dumping result, returning 35374 1726882917.65689: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-000000000103] 35374 1726882917.65694: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000103 35374 1726882917.65781: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000103 35374 1726882917.65789: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.65834: no more pending results, returning what we have 35374 1726882917.65837: results queue empty 35374 1726882917.65837: checking for any_errors_fatal 35374 1726882917.65841: done checking for any_errors_fatal 35374 1726882917.65841: checking for max_fail_percentage 35374 1726882917.65843: done checking for max_fail_percentage 35374 1726882917.65844: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.65845: done checking to see if all hosts have failed 35374 1726882917.65845: getting the remaining hosts for this loop 35374 1726882917.65846: done getting the remaining hosts for this loop 35374 1726882917.65849: getting the next task for host managed_node1 35374 1726882917.65854: done getting next task for host managed_node1 35374 1726882917.65857: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 35374 1726882917.65860: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.65877: getting variables 35374 1726882917.65879: in VariableManager get_vars() 35374 1726882917.65907: Calling all_inventory to load vars for managed_node1 35374 1726882917.65909: Calling groups_inventory to load vars for managed_node1 35374 1726882917.65911: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.65916: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.65918: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.65919: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.66051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.66171: done with get_vars() 35374 1726882917.66178: done getting variables 35374 1726882917.66217: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:41:57 -0400 (0:00:00.016) 0:00:04.241 ****** 35374 1726882917.66238: entering _queue_task() for managed_node1/fail 35374 1726882917.66400: worker is 1 (out of 1 available) 35374 1726882917.66413: exiting _queue_task() for managed_node1/fail 35374 1726882917.66424: done queuing things up, now waiting for results queue to drain 35374 1726882917.66425: waiting for pending results... 
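[NOTE] Despite its name, the consent task queued above loads the fail action: the role presumably aborts unless the operator has explicitly allowed a NetworkManager restart. A sketch in which both the consent flag and the registered check result are assumptions:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-
          Package updates for wireless or team interfaces require restarting
          NetworkManager; set the role's restart/consent variable to proceed.
      when:
        - __network_dnf_check is changed                  # hypothetical: updates were found
        - not (network_allow_restart | default(false))    # hypothetical consent flag
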
35374 1726882917.66592: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 35374 1726882917.66680: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000104 35374 1726882917.66688: variable 'ansible_search_path' from source: unknown 35374 1726882917.66696: variable 'ansible_search_path' from source: unknown 35374 1726882917.66731: calling self._execute() 35374 1726882917.66803: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.66813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.66823: variable 'omit' from source: magic vars 35374 1726882917.67116: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.67126: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.67206: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.67209: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.67213: when evaluation is False, skipping this task 35374 1726882917.67216: _execute() done 35374 1726882917.67219: dumping result to json 35374 1726882917.67223: done dumping result, returning 35374 1726882917.67230: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-000000000104] 35374 1726882917.67235: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000104 35374 1726882917.67323: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000104 35374 1726882917.67326: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.67376: no more pending results, returning what we have 35374 1726882917.67378: results queue empty 35374 1726882917.67379: checking for any_errors_fatal 35374 1726882917.67383: done checking for any_errors_fatal 35374 1726882917.67384: checking for max_fail_percentage 35374 1726882917.67386: done checking for max_fail_percentage 35374 1726882917.67387: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.67387: done checking to see if all hosts have failed 35374 1726882917.67388: getting the remaining hosts for this loop 35374 1726882917.67389: done getting the remaining hosts for this loop 35374 1726882917.67392: getting the next task for host managed_node1 35374 1726882917.67397: done getting next task for host managed_node1 35374 1726882917.67400: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 35374 1726882917.67403: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.67416: getting variables 35374 1726882917.67417: in VariableManager get_vars() 35374 1726882917.67449: Calling all_inventory to load vars for managed_node1 35374 1726882917.67451: Calling groups_inventory to load vars for managed_node1 35374 1726882917.67452: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.67457: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.67459: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.67461: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.67562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.67703: done with get_vars() 35374 1726882917.67712: done getting variables 35374 1726882917.67762: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:57 -0400 (0:00:00.015) 0:00:04.256 ****** 35374 1726882917.67792: entering _queue_task() for managed_node1/package 35374 1726882917.67983: worker is 1 (out of 1 available) 35374 1726882917.67996: exiting _queue_task() for managed_node1/package 35374 1726882917.68007: done queuing things up, now waiting for results queue to drain 35374 1726882917.68008: waiting for pending results... 
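[NOTE] The package action queued above is the distro-agnostic installer (it delegates to dnf, yum, apt, etc. per platform); a minimal sketch, again treating network_packages as an assumed variable:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: network_packages | length > 0   # assumed: skip when the role computed no packages
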
35374 1726882917.68237: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 35374 1726882917.68358: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000105 35374 1726882917.68379: variable 'ansible_search_path' from source: unknown 35374 1726882917.68385: variable 'ansible_search_path' from source: unknown 35374 1726882917.68419: calling self._execute() 35374 1726882917.68490: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.68499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.68509: variable 'omit' from source: magic vars 35374 1726882917.68772: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.68789: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.68858: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.68870: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.68875: when evaluation is False, skipping this task 35374 1726882917.68879: _execute() done 35374 1726882917.68881: dumping result to json 35374 1726882917.68885: done dumping result, returning 35374 1726882917.68891: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-ee6a-9b8c-000000000105] 35374 1726882917.68898: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000105 35374 1726882917.68983: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000105 35374 1726882917.68986: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.69041: no more pending results, returning what we have 35374 1726882917.69044: results queue empty 35374 1726882917.69045: checking for any_errors_fatal 35374 1726882917.69049: done checking for any_errors_fatal 35374 1726882917.69050: checking for max_fail_percentage 35374 1726882917.69051: done checking for max_fail_percentage 35374 1726882917.69052: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.69053: done checking to see if all hosts have failed 35374 1726882917.69053: getting the remaining hosts for this loop 35374 1726882917.69054: done getting the remaining hosts for this loop 35374 1726882917.69057: getting the next task for host managed_node1 35374 1726882917.69062: done getting next task for host managed_node1 35374 1726882917.69067: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 35374 1726882917.69071: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.69087: getting variables 35374 1726882917.69088: in VariableManager get_vars() 35374 1726882917.69120: Calling all_inventory to load vars for managed_node1 35374 1726882917.69122: Calling groups_inventory to load vars for managed_node1 35374 1726882917.69123: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.69129: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.69131: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.69132: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.69263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.69381: done with get_vars() 35374 1726882917.69388: done getting variables 35374 1726882917.69427: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:57 -0400 (0:00:00.016) 0:00:04.273 ****** 35374 1726882917.69447: entering _queue_task() for managed_node1/package 35374 1726882917.69603: worker is 1 (out of 1 available) 35374 1726882917.69616: exiting _queue_task() for managed_node1/package 35374 1726882917.69627: done queuing things up, now waiting for results queue to drain 35374 1726882917.69628: waiting for pending results... 
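[NOTE]: The skip above illustrates Ansible's list-form `when`: the executor evaluated `ansible_distribution_major_version != '6'` (True), then `ansible_distribution_major_version == '7'` (False), stopped at the first false entry, and reported it as `false_condition` in the skip result. A minimal sketch reproducing that behavior (task body and variable name are illustrative):

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # hypothetical variable
        state: present
      when:                              # list form: every entry must be true
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version == '7'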
35374 1726882917.69774: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 35374 1726882917.69856: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000106 35374 1726882917.69868: variable 'ansible_search_path' from source: unknown 35374 1726882917.69874: variable 'ansible_search_path' from source: unknown 35374 1726882917.69899: calling self._execute() 35374 1726882917.69948: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.69951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.69961: variable 'omit' from source: magic vars 35374 1726882917.70199: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.70208: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.70285: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.70290: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.70293: when evaluation is False, skipping this task 35374 1726882917.70296: _execute() done 35374 1726882917.70300: dumping result to json 35374 1726882917.70303: done dumping result, returning 35374 1726882917.70310: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-ee6a-9b8c-000000000106] 35374 1726882917.70318: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000106 35374 1726882917.70403: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000106 35374 1726882917.70405: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.70458: no more pending results, returning what we have 35374 1726882917.70460: results queue empty 35374 1726882917.70461: checking for any_errors_fatal 35374 1726882917.70471: done checking for any_errors_fatal 35374 1726882917.70472: checking for max_fail_percentage 35374 1726882917.70473: done checking for max_fail_percentage 35374 1726882917.70474: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.70475: done checking to see if all hosts have failed 35374 1726882917.70476: getting the remaining hosts for this loop 35374 1726882917.70477: done getting the remaining hosts for this loop 35374 1726882917.70479: getting the next task for host managed_node1 35374 1726882917.70485: done getting next task for host managed_node1 35374 1726882917.70488: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 35374 1726882917.70491: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.70502: getting variables 35374 1726882917.70504: in VariableManager get_vars() 35374 1726882917.70533: Calling all_inventory to load vars for managed_node1 35374 1726882917.70535: Calling groups_inventory to load vars for managed_node1 35374 1726882917.70536: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.70541: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.70543: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.70544: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.70644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.70774: done with get_vars() 35374 1726882917.70782: done getting variables 35374 1726882917.70817: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.287 ****** 35374 1726882917.70839: entering _queue_task() for managed_node1/package 35374 1726882917.70994: worker is 1 (out of 1 available) 35374 1726882917.71006: exiting _queue_task() for managed_node1/package 35374 1726882917.71017: done queuing things up, now waiting for results queue to drain 35374 1726882917.71019: waiting for pending results... 
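[NOTE]: The two install tasks above are, per their names, gated on the caller using the `network_state` variable. A plausible shape for such a guard; the condition is an assumption for illustration, not the role's actual code:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}   # assumed guard: only when a desired state was supplied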
35374 1726882917.71155: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 35374 1726882917.71230: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000107 35374 1726882917.71241: variable 'ansible_search_path' from source: unknown 35374 1726882917.71244: variable 'ansible_search_path' from source: unknown 35374 1726882917.71270: calling self._execute() 35374 1726882917.71327: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.71330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.71338: variable 'omit' from source: magic vars 35374 1726882917.71586: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.71597: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.71672: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.71684: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.71687: when evaluation is False, skipping this task 35374 1726882917.71690: _execute() done 35374 1726882917.71692: dumping result to json 35374 1726882917.71695: done dumping result, returning 35374 1726882917.71701: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-ee6a-9b8c-000000000107] 35374 1726882917.71711: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000107 35374 1726882917.71799: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000107 35374 1726882917.71802: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.71846: no more pending results, returning what we have 35374 1726882917.71849: results queue empty 35374 1726882917.71850: checking for any_errors_fatal 35374 1726882917.71854: done checking for any_errors_fatal 35374 1726882917.71854: checking for max_fail_percentage 35374 1726882917.71856: done checking for max_fail_percentage 35374 1726882917.71857: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.71858: done checking to see if all hosts have failed 35374 1726882917.71858: getting the remaining hosts for this loop 35374 1726882917.71859: done getting the remaining hosts for this loop 35374 1726882917.71862: getting the next task for host managed_node1 35374 1726882917.71871: done getting next task for host managed_node1 35374 1726882917.71874: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 35374 1726882917.71878: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.71889: getting variables 35374 1726882917.71890: in VariableManager get_vars() 35374 1726882917.71916: Calling all_inventory to load vars for managed_node1 35374 1726882917.71918: Calling groups_inventory to load vars for managed_node1 35374 1726882917.71921: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.71926: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.71928: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.71929: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.72197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.72314: done with get_vars() 35374 1726882917.72320: done getting variables 35374 1726882917.72356: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:41:57 -0400 (0:00:00.015) 0:00:04.302 ****** 35374 1726882917.72380: entering _queue_task() for managed_node1/service 35374 1726882917.72524: worker is 1 (out of 1 available) 35374 1726882917.72537: exiting _queue_task() for managed_node1/service 35374 1726882917.72548: done queuing things up, now waiting for results queue to drain 35374 1726882917.72549: waiting for pending results... 
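[NOTE]: A restart task like the one queued above is typically conditional on interface types that require a NetworkManager restart. A hedged sketch; the flag name is hypothetical, standing in for whatever check the role really performs:

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: wireless_or_team_in_use | bool   # hypothetical flag, not the role's real condition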
35374 1726882917.72692: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 35374 1726882917.72775: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000108 35374 1726882917.72786: variable 'ansible_search_path' from source: unknown 35374 1726882917.72789: variable 'ansible_search_path' from source: unknown 35374 1726882917.72816: calling self._execute() 35374 1726882917.72866: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.72872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.72880: variable 'omit' from source: magic vars 35374 1726882917.73123: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.73138: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.73225: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.73229: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.73232: when evaluation is False, skipping this task 35374 1726882917.73235: _execute() done 35374 1726882917.73239: dumping result to json 35374 1726882917.73243: done dumping result, returning 35374 1726882917.73247: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-ee6a-9b8c-000000000108] 35374 1726882917.73259: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000108 35374 1726882917.73342: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000108 35374 1726882917.73345: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.73386: no more pending results, returning what we have 35374 1726882917.73389: results queue empty 35374 1726882917.73390: checking for any_errors_fatal 35374 1726882917.73396: done checking for any_errors_fatal 35374 1726882917.73397: checking for max_fail_percentage 35374 1726882917.73398: done checking for max_fail_percentage 35374 1726882917.73399: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.73400: done checking to see if all hosts have failed 35374 1726882917.73401: getting the remaining hosts for this loop 35374 1726882917.73402: done getting the remaining hosts for this loop 35374 1726882917.73405: getting the next task for host managed_node1 35374 1726882917.73410: done getting next task for host managed_node1 35374 1726882917.73413: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 35374 1726882917.73416: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.73429: getting variables 35374 1726882917.73430: in VariableManager get_vars() 35374 1726882917.73458: Calling all_inventory to load vars for managed_node1 35374 1726882917.73460: Calling groups_inventory to load vars for managed_node1 35374 1726882917.73462: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.73473: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.73476: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.73480: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.73583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.73709: done with get_vars() 35374 1726882917.73716: done getting variables 35374 1726882917.73751: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.316 ****** 35374 1726882917.73771: entering _queue_task() for managed_node1/service 35374 1726882917.73914: worker is 1 (out of 1 available) 35374 1726882917.73927: exiting _queue_task() for managed_node1/service 35374 1726882917.73937: done queuing things up, now waiting for results queue to drain 35374 1726882917.73939: waiting for pending results... 
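[NOTE]: "Enable and start NetworkManager" maps onto the standard service pattern; a minimal sketch:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true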
35374 1726882917.74079: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 35374 1726882917.74154: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000109 35374 1726882917.74165: variable 'ansible_search_path' from source: unknown 35374 1726882917.74171: variable 'ansible_search_path' from source: unknown 35374 1726882917.74198: calling self._execute() 35374 1726882917.74247: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.74250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.74257: variable 'omit' from source: magic vars 35374 1726882917.74504: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.74521: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.74633: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.74644: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.74650: when evaluation is False, skipping this task 35374 1726882917.74655: _execute() done 35374 1726882917.74661: dumping result to json 35374 1726882917.74682: done dumping result, returning 35374 1726882917.74694: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-ee6a-9b8c-000000000109] 35374 1726882917.74702: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000109 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 35374 1726882917.74825: no more pending results, returning what we have 35374 1726882917.74829: results queue empty 35374 1726882917.74830: checking for any_errors_fatal 35374 1726882917.74836: done checking for any_errors_fatal 35374 1726882917.74837: checking for max_fail_percentage 35374 1726882917.74839: done checking for max_fail_percentage 35374 1726882917.74839: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.74840: done checking to see if all hosts have failed 35374 1726882917.74841: getting the remaining hosts for this loop 35374 1726882917.74842: done getting the remaining hosts for this loop 35374 1726882917.74845: getting the next task for host managed_node1 35374 1726882917.74852: done getting next task for host managed_node1 35374 1726882917.74855: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 35374 1726882917.74860: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 35374 1726882917.74879: getting variables 35374 1726882917.74881: in VariableManager get_vars() 35374 1726882917.74927: Calling all_inventory to load vars for managed_node1 35374 1726882917.74929: Calling groups_inventory to load vars for managed_node1 35374 1726882917.74931: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.74940: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.74943: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.74946: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.75226: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000109 35374 1726882917.75229: WORKER PROCESS EXITING 35374 1726882917.75289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.75501: done with get_vars() 35374 1726882917.75509: done getting variables 35374 1726882917.75561: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:41:57 -0400 (0:00:00.018) 0:00:04.334 ****** 35374 1726882917.75588: entering _queue_task() for managed_node1/service 35374 1726882917.75769: worker is 1 (out of 1 available) 35374 1726882917.75781: exiting _queue_task() for managed_node1/service 35374 1726882917.75791: done queuing things up, now waiting for results queue to drain 35374 1726882917.75793: waiting for pending results... 
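[NOTE]: The "censored" skip result shown above for the NetworkManager task is what `no_log: true` looks like in output: even a skipped result is replaced by a placeholder so no variable content can leak. A sketch of the pattern:

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      no_log: true   # every result field is hidden, including for skips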
35374 1726882917.76019: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 35374 1726882917.76138: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000010a 35374 1726882917.76154: variable 'ansible_search_path' from source: unknown 35374 1726882917.76160: variable 'ansible_search_path' from source: unknown 35374 1726882917.76194: calling self._execute() 35374 1726882917.76268: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.76278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.76289: variable 'omit' from source: magic vars 35374 1726882917.76586: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.76596: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.76671: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.76684: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.76687: when evaluation is False, skipping this task 35374 1726882917.76690: _execute() done 35374 1726882917.76693: dumping result to json 35374 1726882917.76696: done dumping result, returning 35374 1726882917.76701: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-ee6a-9b8c-00000000010a] 35374 1726882917.76706: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000010a 35374 1726882917.76791: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000010a 35374 1726882917.76794: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.76835: no more pending results, returning what we have 35374 1726882917.76838: results queue empty 35374 1726882917.76839: checking for any_errors_fatal 35374 1726882917.76842: done checking for any_errors_fatal 35374 1726882917.76843: checking for max_fail_percentage 35374 1726882917.76844: done checking for max_fail_percentage 35374 1726882917.76845: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.76846: done checking to see if all hosts have failed 35374 1726882917.76847: getting the remaining hosts for this loop 35374 1726882917.76848: done getting the remaining hosts for this loop 35374 1726882917.76850: getting the next task for host managed_node1 35374 1726882917.76855: done getting next task for host managed_node1 35374 1726882917.76858: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 35374 1726882917.76862: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 35374 1726882917.76876: getting variables 35374 1726882917.76877: in VariableManager get_vars() 35374 1726882917.76905: Calling all_inventory to load vars for managed_node1 35374 1726882917.76907: Calling groups_inventory to load vars for managed_node1 35374 1726882917.76909: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.76915: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.76916: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.76918: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.77020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.77147: done with get_vars() 35374 1726882917.77154: done getting variables 35374 1726882917.77192: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:41:57 -0400 (0:00:00.016) 0:00:04.350 ****** 35374 1726882917.77211: entering _queue_task() for managed_node1/service 35374 1726882917.77357: worker is 1 (out of 1 available) 35374 1726882917.77374: exiting _queue_task() for managed_node1/service 35374 1726882917.77385: done queuing things up, now waiting for results queue to drain 35374 1726882917.77387: waiting for pending results... 
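[NOTE]: Skip results like these can be acted on later by registering them and using the `skipped` test; a minimal sketch (task bodies illustrative):

    - name: Enable network service
      ansible.builtin.service:
        name: network
        enabled: true
      register: network_service_result

    - name: React to the skip
      ansible.builtin.debug:
        msg: "network service task was skipped"
      when: network_service_result is skipped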
35374 1726882917.77524: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 35374 1726882917.77596: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000010b 35374 1726882917.77606: variable 'ansible_search_path' from source: unknown 35374 1726882917.77609: variable 'ansible_search_path' from source: unknown 35374 1726882917.77633: calling self._execute() 35374 1726882917.77688: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.77692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.77699: variable 'omit' from source: magic vars 35374 1726882917.77929: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.77941: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.78018: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.78021: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.78024: when evaluation is False, skipping this task 35374 1726882917.78027: _execute() done 35374 1726882917.78029: dumping result to json 35374 1726882917.78033: done dumping result, returning 35374 1726882917.78039: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-ee6a-9b8c-00000000010b] 35374 1726882917.78046: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000010b 35374 1726882917.78134: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000010b 35374 1726882917.78138: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 35374 1726882917.78188: no more pending results, returning what we have 35374 1726882917.78190: results queue empty 35374 1726882917.78191: checking for any_errors_fatal 35374 1726882917.78195: done checking for any_errors_fatal 35374 1726882917.78195: checking for max_fail_percentage 35374 1726882917.78196: done checking for max_fail_percentage 35374 1726882917.78197: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.78197: done checking to see if all hosts have failed 35374 1726882917.78198: getting the remaining hosts for this loop 35374 1726882917.78199: done getting the remaining hosts for this loop 35374 1726882917.78201: getting the next task for host managed_node1 35374 1726882917.78204: done getting next task for host managed_node1 35374 1726882917.78207: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 35374 1726882917.78214: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.78225: getting variables 35374 1726882917.78226: in VariableManager get_vars() 35374 1726882917.78252: Calling all_inventory to load vars for managed_node1 35374 1726882917.78254: Calling groups_inventory to load vars for managed_node1 35374 1726882917.78255: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.78260: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.78262: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.78265: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.78402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.78519: done with get_vars() 35374 1726882917.78526: done getting variables 35374 1726882917.78562: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.364 ****** 35374 1726882917.78587: entering _queue_task() for managed_node1/copy 35374 1726882917.78736: worker is 1 (out of 1 available) 35374 1726882917.78749: exiting _queue_task() for managed_node1/copy 35374 1726882917.78759: done queuing things up, now waiting for results queue to drain 35374 1726882917.78761: waiting for pending results... 
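[NOTE]: The task above uses the `copy` action to guarantee a file that the initscripts provider depends on. A hedged sketch; both the content and the destination are assumptions, not taken from the role:

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        content: "# Created by the network system role\n"   # assumed content
        dest: /etc/sysconfig/network                         # assumed path
        mode: "0644"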
35374 1726882917.78902: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 35374 1726882917.78978: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000010c 35374 1726882917.78992: variable 'ansible_search_path' from source: unknown 35374 1726882917.78994: variable 'ansible_search_path' from source: unknown 35374 1726882917.79019: calling self._execute() 35374 1726882917.79070: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.79074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.79081: variable 'omit' from source: magic vars 35374 1726882917.79311: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.79322: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.79395: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.79399: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.79403: when evaluation is False, skipping this task 35374 1726882917.79406: _execute() done 35374 1726882917.79410: dumping result to json 35374 1726882917.79412: done dumping result, returning 35374 1726882917.79419: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-ee6a-9b8c-00000000010c] 35374 1726882917.79429: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000010c 35374 1726882917.79511: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000010c 35374 1726882917.79514: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.79570: no more pending results, returning what we have 35374 1726882917.79573: results queue empty 35374 1726882917.79574: checking for any_errors_fatal 35374 1726882917.79578: done checking for any_errors_fatal 35374 1726882917.79578: checking for max_fail_percentage 35374 1726882917.79580: done checking for max_fail_percentage 35374 1726882917.79581: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.79581: done checking to see if all hosts have failed 35374 1726882917.79582: getting the remaining hosts for this loop 35374 1726882917.79583: done getting the remaining hosts for this loop 35374 1726882917.79586: getting the next task for host managed_node1 35374 1726882917.79591: done getting next task for host managed_node1 35374 1726882917.79595: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 35374 1726882917.79598: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.79609: getting variables 35374 1726882917.79610: in VariableManager get_vars() 35374 1726882917.79640: Calling all_inventory to load vars for managed_node1 35374 1726882917.79642: Calling groups_inventory to load vars for managed_node1 35374 1726882917.79643: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.79648: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.79650: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.79651: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.79749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.79870: done with get_vars() 35374 1726882917.79878: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.377 ****** 35374 1726882917.79928: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 35374 1726882917.80071: worker is 1 (out of 1 available) 35374 1726882917.80082: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 35374 1726882917.80093: done queuing things up, now waiting for results queue to drain 35374 1726882917.80094: waiting for pending results... 35374 1726882917.80234: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 35374 1726882917.80307: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000010d 35374 1726882917.80318: variable 'ansible_search_path' from source: unknown 35374 1726882917.80321: variable 'ansible_search_path' from source: unknown 35374 1726882917.80348: calling self._execute() 35374 1726882917.80406: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.80410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.80417: variable 'omit' from source: magic vars 35374 1726882917.80660: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.80673: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.80748: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.80751: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.80754: when evaluation is False, skipping this task 35374 1726882917.80756: _execute() done 35374 1726882917.80759: dumping result to json 35374 1726882917.80762: done dumping result, returning 35374 1726882917.80773: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-ee6a-9b8c-00000000010d] 35374 1726882917.80775: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000010d 35374 1726882917.80859: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000010d 35374 1726882917.80862: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.80916: no more pending results, returning what we have 
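[NOTE]: "Configure networking connection profiles" is the role's custom `network_connections` action. A representative invocation of the role from a playbook; the profile below is illustrative, and the role's README documents the full schema:

    - hosts: managed_node1
      roles:
        - fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: eth0          # illustrative profile
            type: ethernet
            state: up
            ip:
              dhcp4: true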
35374 1726882917.80919: results queue empty 35374 1726882917.80920: checking for any_errors_fatal 35374 1726882917.80925: done checking for any_errors_fatal 35374 1726882917.80926: checking for max_fail_percentage 35374 1726882917.80927: done checking for max_fail_percentage 35374 1726882917.80928: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.80929: done checking to see if all hosts have failed 35374 1726882917.80930: getting the remaining hosts for this loop 35374 1726882917.80931: done getting the remaining hosts for this loop 35374 1726882917.80933: getting the next task for host managed_node1 35374 1726882917.80938: done getting next task for host managed_node1 35374 1726882917.80941: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 35374 1726882917.80944: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.80954: getting variables 35374 1726882917.80955: in VariableManager get_vars() 35374 1726882917.80990: Calling all_inventory to load vars for managed_node1 35374 1726882917.80992: Calling groups_inventory to load vars for managed_node1 35374 1726882917.80993: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.80999: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.81000: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.81002: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.81128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.81245: done with get_vars() 35374 1726882917.81251: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.391 ****** 35374 1726882917.81307: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 35374 1726882917.81446: worker is 1 (out of 1 available) 35374 1726882917.81457: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 35374 1726882917.81472: done queuing things up, now waiting for results queue to drain 35374 1726882917.81474: waiting for pending results... 
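[NOTE]: "Configure networking state" runs the role's `network_state` action, which applies an nmstate-style desired state. A representative value, assuming the common nmstate schema:

    network_state:
      interfaces:
        - name: eth0            # illustrative interface
          type: ethernet
          state: up
          ipv4:
            enabled: true
            dhcp: true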
35374 1726882917.81613: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 35374 1726882917.81686: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000010e 35374 1726882917.81697: variable 'ansible_search_path' from source: unknown 35374 1726882917.81700: variable 'ansible_search_path' from source: unknown 35374 1726882917.81724: calling self._execute() 35374 1726882917.81780: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.81784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.81791: variable 'omit' from source: magic vars 35374 1726882917.82026: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.82035: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.82113: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.82116: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.82119: when evaluation is False, skipping this task 35374 1726882917.82121: _execute() done 35374 1726882917.82124: dumping result to json 35374 1726882917.82128: done dumping result, returning 35374 1726882917.82134: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-ee6a-9b8c-00000000010e] 35374 1726882917.82140: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000010e 35374 1726882917.82222: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000010e 35374 1726882917.82225: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 35374 1726882917.82291: no more pending results, returning what we have 35374 1726882917.82294: results queue empty 35374 1726882917.82295: checking for any_errors_fatal 35374 1726882917.82299: done checking for any_errors_fatal 35374 1726882917.82300: checking for max_fail_percentage 35374 1726882917.82301: done checking for max_fail_percentage 35374 1726882917.82302: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.82303: done checking to see if all hosts have failed 35374 1726882917.82303: getting the remaining hosts for this loop 35374 1726882917.82304: done getting the remaining hosts for this loop 35374 1726882917.82307: getting the next task for host managed_node1 35374 1726882917.82312: done getting next task for host managed_node1 35374 1726882917.82314: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 35374 1726882917.82318: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 35374 1726882917.82332: getting variables 35374 1726882917.82333: in VariableManager get_vars() 35374 1726882917.82363: Calling all_inventory to load vars for managed_node1 35374 1726882917.82366: Calling groups_inventory to load vars for managed_node1 35374 1726882917.82368: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.82374: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.82376: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.82378: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.82474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.82592: done with get_vars() 35374 1726882917.82600: done getting variables 35374 1726882917.82636: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.405 ****** 35374 1726882917.82657: entering _queue_task() for managed_node1/debug 35374 1726882917.82802: worker is 1 (out of 1 available) 35374 1726882917.82813: exiting _queue_task() for managed_node1/debug 35374 1726882917.82824: done queuing things up, now waiting for results queue to drain 35374 1726882917.82825: waiting for pending results... 
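[NOTE]: The "Show stderr messages" task queued above is a `debug` action over the result of the earlier connection-configuration step. A sketch of that pattern; the registered variable name is hypothetical:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: network_connections_result.stderr   # hypothetical register name
      when: network_connections_result.stderr is defined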
35374 1726882917.82969: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 35374 1726882917.83041: in run() - task 0e448fcc-3ce9-ee6a-9b8c-00000000010f 35374 1726882917.83052: variable 'ansible_search_path' from source: unknown 35374 1726882917.83056: variable 'ansible_search_path' from source: unknown 35374 1726882917.83086: calling self._execute() 35374 1726882917.83136: variable 'ansible_host' from source: host vars for 'managed_node1' 35374 1726882917.83141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 35374 1726882917.83146: variable 'omit' from source: magic vars 35374 1726882917.83409: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.83419: Evaluated conditional (ansible_distribution_major_version != '6'): True 35374 1726882917.83497: variable 'ansible_distribution_major_version' from source: facts 35374 1726882917.83502: Evaluated conditional (ansible_distribution_major_version == '7'): False 35374 1726882917.83505: when evaluation is False, skipping this task 35374 1726882917.83507: _execute() done 35374 1726882917.83510: dumping result to json 35374 1726882917.83514: done dumping result, returning 35374 1726882917.83521: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-ee6a-9b8c-00000000010f] 35374 1726882917.83526: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000010f 35374 1726882917.83610: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-00000000010f 35374 1726882917.83612: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 35374 1726882917.83655: no more pending results, returning what we have 35374 1726882917.83658: results queue empty 35374 1726882917.83659: checking for any_errors_fatal 35374 1726882917.83662: done checking for any_errors_fatal 35374 1726882917.83662: checking for max_fail_percentage 35374 1726882917.83666: done checking for max_fail_percentage 35374 1726882917.83667: checking to see if all hosts have failed and the running result is not ok 35374 1726882917.83670: done checking to see if all hosts have failed 35374 1726882917.83670: getting the remaining hosts for this loop 35374 1726882917.83672: done getting the remaining hosts for this loop 35374 1726882917.83674: getting the next task for host managed_node1 35374 1726882917.83680: done getting next task for host managed_node1 35374 1726882917.83683: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 35374 1726882917.83686: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 35374 1726882917.83698: getting variables 35374 1726882917.83699: in VariableManager get_vars() 35374 1726882917.83730: Calling all_inventory to load vars for managed_node1 35374 1726882917.83732: Calling groups_inventory to load vars for managed_node1 35374 1726882917.83733: Calling all_plugins_inventory to load vars for managed_node1 35374 1726882917.83739: Calling all_plugins_play to load vars for managed_node1 35374 1726882917.83740: Calling groups_plugins_inventory to load vars for managed_node1 35374 1726882917.83742: Calling groups_plugins_play to load vars for managed_node1 35374 1726882917.83843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 35374 1726882917.83994: done with get_vars() 35374 1726882917.84000: done getting variables 35374 1726882917.84034: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.419 ****** 35374 1726882917.84055: entering _queue_task() for managed_node1/debug 35374 1726882917.84199: worker is 1 (out of 1 available) 35374 1726882917.84211: exiting _queue_task() for managed_node1/debug 35374 1726882917.84220: done queuing things up, now waiting for results queue to drain 35374 1726882917.84222: waiting for pending results... 
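[NOTE]: The "pid timestamp: message" lines throughout this transcript come from running with Ansible's internal debug output enabled (ANSIBLE_DEBUG=1) on top of high verbosity (-vvvv). Debug tasks themselves can also be tied to verbosity so they only print when requested; a minimal sketch:

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: network_connections_result   # hypothetical register name
        verbosity: 1                      # printed only with -v or higher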
35374 1726882917.84356: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
35374 1726882917.84430: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000110
35374 1726882917.84441: variable 'ansible_search_path' from source: unknown
35374 1726882917.84444: variable 'ansible_search_path' from source: unknown
35374 1726882917.84472: calling self._execute()
35374 1726882917.84521: variable 'ansible_host' from source: host vars for 'managed_node1'
35374 1726882917.84524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
35374 1726882917.84532: variable 'omit' from source: magic vars
35374 1726882917.84757: variable 'ansible_distribution_major_version' from source: facts
35374 1726882917.84770: Evaluated conditional (ansible_distribution_major_version != '6'): True
35374 1726882917.84844: variable 'ansible_distribution_major_version' from source: facts
35374 1726882917.84847: Evaluated conditional (ansible_distribution_major_version == '7'): False
35374 1726882917.84850: when evaluation is False, skipping this task
35374 1726882917.84853: _execute() done
35374 1726882917.84856: dumping result to json
35374 1726882917.84858: done dumping result, returning
35374 1726882917.84867: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-ee6a-9b8c-000000000110]
35374 1726882917.84872: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000110
35374 1726882917.84949: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000110
35374 1726882917.84952: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
35374 1726882917.85020: no more pending results, returning what we have
35374 1726882917.85022: results queue empty
35374 1726882917.85022: checking for any_errors_fatal
35374 1726882917.85025: done checking for any_errors_fatal
35374 1726882917.85025: checking for max_fail_percentage
35374 1726882917.85026: done checking for max_fail_percentage
35374 1726882917.85027: checking to see if all hosts have failed and the running result is not ok
35374 1726882917.85027: done checking to see if all hosts have failed
35374 1726882917.85028: getting the remaining hosts for this loop
35374 1726882917.85029: done getting the remaining hosts for this loop
35374 1726882917.85031: getting the next task for host managed_node1
35374 1726882917.85035: done getting next task for host managed_node1
35374 1726882917.85037: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
35374 1726882917.85040: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
35374 1726882917.85049: getting variables
35374 1726882917.85050: in VariableManager get_vars()
35374 1726882917.85086: Calling all_inventory to load vars for managed_node1
35374 1726882917.85088: Calling groups_inventory to load vars for managed_node1
35374 1726882917.85090: Calling all_plugins_inventory to load vars for managed_node1
35374 1726882917.85095: Calling all_plugins_play to load vars for managed_node1
35374 1726882917.85096: Calling groups_plugins_inventory to load vars for managed_node1
35374 1726882917.85098: Calling groups_plugins_play to load vars for managed_node1
35374 1726882917.85206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
35374 1726882917.85327: done with get_vars()
35374 1726882917.85334: done getting variables
35374 1726882917.85374: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.432 ******
35374 1726882917.85396: entering _queue_task() for managed_node1/debug
35374 1726882917.85540: worker is 1 (out of 1 available)
35374 1726882917.85551: exiting _queue_task() for managed_node1/debug
35374 1726882917.85561: done queuing things up, now waiting for results queue to drain
35374 1726882917.85562: waiting for pending results...
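The records below run the matching debug task for the network_state variable and then reach the role's final "Re-test connectivity" task, which the -vvvv trace dispatches as managed_node1/ping. A hedged sketch of such a re-test task: the task name and the evaluated guard come from the log, the bare ping invocation is an assumption, and the guard may equally be inherited from an enclosing block or include rather than set on the task itself:

    - name: Re-test connectivity
      ping:
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version == '7'   # False in this run, so the re-test is skipped

Note a small difference in the skip payloads: the debug skips print only "false_condition", while module tasks such as ping also report "changed": false and "skip_reason": "Conditional result was False".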
35374 1726882917.85698: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
35374 1726882917.85770: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000111
35374 1726882917.85778: variable 'ansible_search_path' from source: unknown
35374 1726882917.85782: variable 'ansible_search_path' from source: unknown
35374 1726882917.85806: calling self._execute()
35374 1726882917.85857: variable 'ansible_host' from source: host vars for 'managed_node1'
35374 1726882917.85861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
35374 1726882917.85872: variable 'omit' from source: magic vars
35374 1726882917.86094: variable 'ansible_distribution_major_version' from source: facts
35374 1726882917.86104: Evaluated conditional (ansible_distribution_major_version != '6'): True
35374 1726882917.86181: variable 'ansible_distribution_major_version' from source: facts
35374 1726882917.86184: Evaluated conditional (ansible_distribution_major_version == '7'): False
35374 1726882917.86187: when evaluation is False, skipping this task
35374 1726882917.86191: _execute() done
35374 1726882917.86194: dumping result to json
35374 1726882917.86197: done dumping result, returning
35374 1726882917.86203: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-ee6a-9b8c-000000000111]
35374 1726882917.86208: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000111
35374 1726882917.86292: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000111
35374 1726882917.86294: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
35374 1726882917.86355: no more pending results, returning what we have
35374 1726882917.86357: results queue empty
35374 1726882917.86358: checking for any_errors_fatal
35374 1726882917.86361: done checking for any_errors_fatal
35374 1726882917.86361: checking for max_fail_percentage
35374 1726882917.86362: done checking for max_fail_percentage
35374 1726882917.86365: checking to see if all hosts have failed and the running result is not ok
35374 1726882917.86365: done checking to see if all hosts have failed
35374 1726882917.86366: getting the remaining hosts for this loop
35374 1726882917.86367: done getting the remaining hosts for this loop
35374 1726882917.86371: getting the next task for host managed_node1
35374 1726882917.86375: done getting next task for host managed_node1
35374 1726882917.86377: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
35374 1726882917.86380: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
35374 1726882917.86390: getting variables
35374 1726882917.86391: in VariableManager get_vars()
35374 1726882917.86422: Calling all_inventory to load vars for managed_node1
35374 1726882917.86424: Calling groups_inventory to load vars for managed_node1
35374 1726882917.86425: Calling all_plugins_inventory to load vars for managed_node1
35374 1726882917.86430: Calling all_plugins_play to load vars for managed_node1
35374 1726882917.86432: Calling groups_plugins_inventory to load vars for managed_node1
35374 1726882917.86434: Calling groups_plugins_play to load vars for managed_node1
35374 1726882917.86569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
35374 1726882917.86686: done with get_vars()
35374 1726882917.86692: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.446 ******
35374 1726882917.86750: entering _queue_task() for managed_node1/ping
35374 1726882917.86898: worker is 1 (out of 1 available)
35374 1726882917.86910: exiting _queue_task() for managed_node1/ping
35374 1726882917.86920: done queuing things up, now waiting for results queue to drain
35374 1726882917.86921: waiting for pending results...
35374 1726882917.87052: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
35374 1726882917.87125: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000112
35374 1726882917.87134: variable 'ansible_search_path' from source: unknown
35374 1726882917.87138: variable 'ansible_search_path' from source: unknown
35374 1726882917.87162: calling self._execute()
35374 1726882917.87214: variable 'ansible_host' from source: host vars for 'managed_node1'
35374 1726882917.87218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
35374 1726882917.87226: variable 'omit' from source: magic vars
35374 1726882917.87449: variable 'ansible_distribution_major_version' from source: facts
35374 1726882917.87459: Evaluated conditional (ansible_distribution_major_version != '6'): True
35374 1726882917.87537: variable 'ansible_distribution_major_version' from source: facts
35374 1726882917.87541: Evaluated conditional (ansible_distribution_major_version == '7'): False
35374 1726882917.87543: when evaluation is False, skipping this task
35374 1726882917.87546: _execute() done
35374 1726882917.87549: dumping result to json
35374 1726882917.87553: done dumping result, returning
35374 1726882917.87559: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-ee6a-9b8c-000000000112]
35374 1726882917.87566: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000112
35374 1726882917.87639: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000112
35374 1726882917.87642: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
35374 1726882917.87712: no more pending results, returning what we have
35374 1726882917.87714: results queue empty
35374 1726882917.87714: checking for any_errors_fatal
35374 1726882917.87717: done checking for any_errors_fatal
35374 1726882917.87717: checking for max_fail_percentage
35374 1726882917.87718: done checking for max_fail_percentage
35374 1726882917.87719: checking to see if all hosts have failed and the running result is not ok
35374 1726882917.87719: done checking to see if all hosts have failed
35374 1726882917.87720: getting the remaining hosts for this loop
35374 1726882917.87721: done getting the remaining hosts for this loop
35374 1726882917.87723: getting the next task for host managed_node1
35374 1726882917.87728: done getting next task for host managed_node1
35374 1726882917.87729: ^ task is: TASK: meta (role_complete)
35374 1726882917.87732: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
35374 1726882917.87743: getting variables
35374 1726882917.87744: in VariableManager get_vars()
35374 1726882917.87779: Calling all_inventory to load vars for managed_node1
35374 1726882917.87781: Calling groups_inventory to load vars for managed_node1
35374 1726882917.87783: Calling all_plugins_inventory to load vars for managed_node1
35374 1726882917.87788: Calling all_plugins_play to load vars for managed_node1
35374 1726882917.87790: Calling groups_plugins_inventory to load vars for managed_node1
35374 1726882917.87792: Calling groups_plugins_play to load vars for managed_node1
35374 1726882917.87893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
35374 1726882917.88011: done with get_vars()
35374 1726882917.88019: done getting variables
35374 1726882917.88069: done queuing things up, now waiting for results queue to drain
35374 1726882917.88070: results queue empty
35374 1726882917.88071: checking for any_errors_fatal
35374 1726882917.88072: done checking for any_errors_fatal
35374 1726882917.88073: checking for max_fail_percentage
35374 1726882917.88074: done checking for max_fail_percentage
35374 1726882917.88074: checking to see if all hosts have failed and the running result is not ok
35374 1726882917.88075: done checking to see if all hosts have failed
35374 1726882917.88075: getting the remaining hosts for this loop
35374 1726882917.88076: done getting the remaining hosts for this loop
35374 1726882917.88079: getting the next task for host managed_node1
35374 1726882917.88081: done getting next task for host managed_node1
35374 1726882917.88082: ^ task is: TASK: Include the task 'cleanup_mock_wifi.yml'
35374 1726882917.88084: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
35374 1726882917.88085: getting variables
35374 1726882917.88086: in VariableManager get_vars()
35374 1726882917.88096: Calling all_inventory to load vars for managed_node1
35374 1726882917.88098: Calling groups_inventory to load vars for managed_node1
35374 1726882917.88099: Calling all_plugins_inventory to load vars for managed_node1
35374 1726882917.88102: Calling all_plugins_play to load vars for managed_node1
35374 1726882917.88103: Calling groups_plugins_inventory to load vars for managed_node1
35374 1726882917.88105: Calling groups_plugins_play to load vars for managed_node1
35374 1726882917.88184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
35374 1726882917.88314: done with get_vars()
35374 1726882917.88319: done getting variables

TASK [Include the task 'cleanup_mock_wifi.yml'] ********************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:96
Friday 20 September 2024 21:41:57 -0400 (0:00:00.016) 0:00:04.462 ******
35374 1726882917.88360: entering _queue_task() for managed_node1/include_tasks
35374 1726882917.88516: worker is 1 (out of 1 available)
35374 1726882917.88530: exiting _queue_task() for managed_node1/include_tasks
35374 1726882917.88541: done queuing things up, now waiting for results queue to drain
35374 1726882917.88542: waiting for pending results...
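The include task queued above never loads its file, because include_tasks with a failing when: skips the include itself, so none of the tasks inside cleanup_mock_wifi.yml are read or counted. A sketch of the shape such an include could take in tests_wireless.yml; the file path is an assumption, since only the task name and its playbook position (tests_wireless.yml:96) appear in the log:

    - name: Include the task 'cleanup_mock_wifi.yml'
      include_tasks: tasks/cleanup_mock_wifi.yml   # path assumed for illustration
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version == '7'   # False in this run (see the skip result below)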
35374 1726882917.88677: running TaskExecutor() for managed_node1/TASK: Include the task 'cleanup_mock_wifi.yml'
35374 1726882917.88733: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000142
35374 1726882917.88750: variable 'ansible_search_path' from source: unknown
35374 1726882917.88783: calling self._execute()
35374 1726882917.88836: variable 'ansible_host' from source: host vars for 'managed_node1'
35374 1726882917.88844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
35374 1726882917.88853: variable 'omit' from source: magic vars
35374 1726882917.89082: variable 'ansible_distribution_major_version' from source: facts
35374 1726882917.89093: Evaluated conditional (ansible_distribution_major_version != '6'): True
35374 1726882917.89167: variable 'ansible_distribution_major_version' from source: facts
35374 1726882917.89175: Evaluated conditional (ansible_distribution_major_version == '7'): False
35374 1726882917.89178: when evaluation is False, skipping this task
35374 1726882917.89181: _execute() done
35374 1726882917.89184: dumping result to json
35374 1726882917.89187: done dumping result, returning
35374 1726882917.89189: done running TaskExecutor() for managed_node1/TASK: Include the task 'cleanup_mock_wifi.yml' [0e448fcc-3ce9-ee6a-9b8c-000000000142]
35374 1726882917.89194: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000142
35374 1726882917.89280: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000142
35374 1726882917.89284: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
35374 1726882917.89338: no more pending results, returning what we have
35374 1726882917.89341: results queue empty
35374 1726882917.89342: checking for any_errors_fatal
35374 1726882917.89343: done checking for any_errors_fatal
35374 1726882917.89344: checking for max_fail_percentage
35374 1726882917.89345: done checking for max_fail_percentage
35374 1726882917.89346: checking to see if all hosts have failed and the running result is not ok
35374 1726882917.89347: done checking to see if all hosts have failed
35374 1726882917.89348: getting the remaining hosts for this loop
35374 1726882917.89349: done getting the remaining hosts for this loop
35374 1726882917.89351: getting the next task for host managed_node1
35374 1726882917.89355: done getting next task for host managed_node1
35374 1726882917.89356: ^ task is: TASK: Verify network state restored to default
35374 1726882917.89359: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
35374 1726882917.89360: getting variables
35374 1726882917.89361: in VariableManager get_vars()
35374 1726882917.89393: Calling all_inventory to load vars for managed_node1
35374 1726882917.89395: Calling groups_inventory to load vars for managed_node1
35374 1726882917.89397: Calling all_plugins_inventory to load vars for managed_node1
35374 1726882917.89402: Calling all_plugins_play to load vars for managed_node1
35374 1726882917.89404: Calling groups_plugins_inventory to load vars for managed_node1
35374 1726882917.89406: Calling groups_plugins_play to load vars for managed_node1
35374 1726882917.89511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
35374 1726882917.89638: done with get_vars()
35374 1726882917.89646: done getting variables

TASK [Verify network state restored to default] ********************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:98
Friday 20 September 2024 21:41:57 -0400 (0:00:00.013) 0:00:04.475 ******
35374 1726882917.89705: entering _queue_task() for managed_node1/include_tasks
35374 1726882917.89852: worker is 1 (out of 1 available)
35374 1726882917.89866: exiting _queue_task() for managed_node1/include_tasks
35374 1726882917.89880: done queuing things up, now waiting for results queue to drain
35374 1726882917.89882: waiting for pending results...
35374 1726882917.90020: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default
35374 1726882917.90081: in run() - task 0e448fcc-3ce9-ee6a-9b8c-000000000143
35374 1726882917.90088: variable 'ansible_search_path' from source: unknown
35374 1726882917.90113: calling self._execute()
35374 1726882917.90181: variable 'ansible_host' from source: host vars for 'managed_node1'
35374 1726882917.90184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
35374 1726882917.90191: variable 'omit' from source: magic vars
35374 1726882917.90443: variable 'ansible_distribution_major_version' from source: facts
35374 1726882917.90453: Evaluated conditional (ansible_distribution_major_version != '6'): True
35374 1726882917.90530: variable 'ansible_distribution_major_version' from source: facts
35374 1726882917.90535: Evaluated conditional (ansible_distribution_major_version == '7'): False
35374 1726882917.90543: when evaluation is False, skipping this task
35374 1726882917.90546: _execute() done
35374 1726882917.90548: dumping result to json
35374 1726882917.90553: done dumping result, returning
35374 1726882917.90571: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0e448fcc-3ce9-ee6a-9b8c-000000000143]
35374 1726882917.90574: sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000143
35374 1726882917.90650: done sending task result for task 0e448fcc-3ce9-ee6a-9b8c-000000000143
35374 1726882917.90653: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
35374 1726882917.90707: no more pending results, returning what we have
35374 1726882917.90709: results queue empty
35374 1726882917.90710: checking for any_errors_fatal
35374 1726882917.90715: done checking for any_errors_fatal
35374 1726882917.90716: checking for max_fail_percentage
35374 1726882917.90717: done checking for max_fail_percentage
35374 1726882917.90718: checking to see if all hosts have failed and the running result is not ok
35374 1726882917.90719: done checking to see if all hosts have failed
35374 1726882917.90719: getting the remaining hosts for this loop
35374 1726882917.90721: done getting the remaining hosts for this loop
35374 1726882917.90724: getting the next task for host managed_node1
35374 1726882917.90730: done getting next task for host managed_node1
35374 1726882917.90732: ^ task is: TASK: meta (flush_handlers)
35374 1726882917.90734: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
35374 1726882917.90738: getting variables
35374 1726882917.90739: in VariableManager get_vars()
35374 1726882917.90770: Calling all_inventory to load vars for managed_node1
35374 1726882917.90773: Calling groups_inventory to load vars for managed_node1
35374 1726882917.90774: Calling all_plugins_inventory to load vars for managed_node1
35374 1726882917.90781: Calling all_plugins_play to load vars for managed_node1
35374 1726882917.90782: Calling groups_plugins_inventory to load vars for managed_node1
35374 1726882917.90784: Calling groups_plugins_play to load vars for managed_node1
35374 1726882917.90885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
35374 1726882917.91038: done with get_vars()
35374 1726882917.91046: done getting variables
35374 1726882917.91094: in VariableManager get_vars()
35374 1726882917.91104: Calling all_inventory to load vars for managed_node1
35374 1726882917.91106: Calling groups_inventory to load vars for managed_node1
35374 1726882917.91107: Calling all_plugins_inventory to load vars for managed_node1
35374 1726882917.91110: Calling all_plugins_play to load vars for managed_node1
35374 1726882917.91111: Calling groups_plugins_inventory to load vars for managed_node1
35374 1726882917.91113: Calling groups_plugins_play to load vars for managed_node1
35374 1726882917.91192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
35374 1726882917.91313: done with get_vars()
35374 1726882917.91321: done queuing things up, now waiting for results queue to drain
35374 1726882917.91322: results queue empty
35374 1726882917.91322: checking for any_errors_fatal
35374 1726882917.91323: done checking for any_errors_fatal
35374 1726882917.91324: checking for max_fail_percentage
35374 1726882917.91324: done checking for max_fail_percentage
35374 1726882917.91325: checking to see if all hosts have failed and the running result is not ok
35374 1726882917.91325: done checking to see if all hosts have failed
35374 1726882917.91326: getting the remaining hosts for this loop
35374 1726882917.91326: done getting the remaining hosts for this loop
35374 1726882917.91328: getting the next task for host managed_node1
35374 1726882917.91330: done getting next task for host managed_node1
35374 1726882917.91331: ^ task is: TASK: meta (flush_handlers)
35374 1726882917.91332: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
35374 1726882917.91333: getting variables
35374 1726882917.91333: in VariableManager get_vars()
35374 1726882917.91342: Calling all_inventory to load vars for managed_node1
35374 1726882917.91344: Calling groups_inventory to load vars for managed_node1
35374 1726882917.91345: Calling all_plugins_inventory to load vars for managed_node1
35374 1726882917.91348: Calling all_plugins_play to load vars for managed_node1
35374 1726882917.91349: Calling groups_plugins_inventory to load vars for managed_node1
35374 1726882917.91351: Calling groups_plugins_play to load vars for managed_node1
35374 1726882917.91430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
35374 1726882917.91538: done with get_vars()
35374 1726882917.91543: done getting variables
35374 1726882917.91586: in VariableManager get_vars()
35374 1726882917.91598: Calling all_inventory to load vars for managed_node1
35374 1726882917.91600: Calling groups_inventory to load vars for managed_node1
35374 1726882917.91601: Calling all_plugins_inventory to load vars for managed_node1
35374 1726882917.91605: Calling all_plugins_play to load vars for managed_node1
35374 1726882917.91606: Calling groups_plugins_inventory to load vars for managed_node1
35374 1726882917.91608: Calling groups_plugins_play to load vars for managed_node1
35374 1726882917.91706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
35374 1726882917.91827: done with get_vars()
35374 1726882917.91834: done queuing things up, now waiting for results queue to drain
35374 1726882917.91836: results queue empty
35374 1726882917.91836: checking for any_errors_fatal
35374 1726882917.91837: done checking for any_errors_fatal
35374 1726882917.91837: checking for max_fail_percentage
35374 1726882917.91838: done checking for max_fail_percentage
35374 1726882917.91838: checking to see if all hosts have failed and the running result is not ok
35374 1726882917.91839: done checking to see if all hosts have failed
35374 1726882917.91839: getting the remaining hosts for this loop
35374 1726882917.91840: done getting the remaining hosts for this loop
35374 1726882917.91844: getting the next task for host managed_node1
35374 1726882917.91846: done getting next task for host managed_node1
35374 1726882917.91846: ^ task is: None
35374 1726882917.91847: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
35374 1726882917.91848: done queuing things up, now waiting for results queue to drain
35374 1726882917.91848: results queue empty
35374 1726882917.91849: checking for any_errors_fatal
35374 1726882917.91849: done checking for any_errors_fatal
35374 1726882917.91850: checking for max_fail_percentage
35374 1726882917.91850: done checking for max_fail_percentage
35374 1726882917.91851: checking to see if all hosts have failed and the running result is not ok
35374 1726882917.91851: done checking to see if all hosts have failed
35374 1726882917.91852: getting the next task for host managed_node1
35374 1726882917.91854: done getting next task for host managed_node1
35374 1726882917.91854: ^ task is: None
35374 1726882917.91855: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1 : ok=6 changed=0 unreachable=0 failed=0 skipped=103 rescued=0 ignored=0

Friday 20 September 2024 21:41:57 -0400 (0:00:00.021) 0:00:04.497 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 1.53s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Check if system is ostree ----------------------------------------------- 0.63s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Install packages -------------------- 0.05s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later --- 0.05s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Gather the minimum subset of ansible_facts required by the network role test --- 0.05s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces --- 0.05s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
fedora.linux_system_roles.network : Enable and start wpa_supplicant ----- 0.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Include the task 'enable_epel.yml' -------------------------------------- 0.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Set network provider to 'nm' -------------------------------------------- 0.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces --- 0.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable --- 0.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Gathering Facts --------------------------------------------------------- 0.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:3
fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable --- 0.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces --- 0.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Copy client certs ------------------------------------------------------- 0.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:13
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.03s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces --- 0.03s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable --- 0.03s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
fedora.linux_system_roles.network : Enable network service -------------- 0.03s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
35374 1726882917.91924: RUNNING CLEANUP
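The recap confirms the shape of the run: 6 tasks returned ok, 103 were skipped by the guard discussed above, and nothing changed, failed, or was rescued, with the whole play finishing in roughly 4.5 seconds (0:00:04.497). The timing summary also names the wrapper playbook's own tasks, which permits a speculative reconstruction of tests_wireless_nm.yml; everything below is inferred only from the task names and file positions in the summary (the hosts pattern, the set_fact variable name, and the import are assumptions, not the actual test source):

    # Hypothetical reconstruction of tests_wireless_nm.yml, inferred from the summary above
    - name: Run tests_wireless.yml with the 'nm' provider
      hosts: all                                  # hosts pattern assumed
      tasks:
        - name: Set network provider to 'nm'      # tests_wireless_nm.yml:13 in the summary
          set_fact:
            network_provider: nm

    - import_playbook: playbooks/tests_wireless.yml   # its own Gathering Facts appears at tests_wireless.yml:3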